Apr 16 18:27:55.785959 ip-10-0-137-47 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:27:55.785975 ip-10-0-137-47 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:27:55.785984 ip-10-0-137-47 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:27:55.786271 ip-10-0-137-47 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:28:06.034975 ip-10-0-137-47 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:28:06.034991 ip-10-0-137-47 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a3132770020242fbaf247d4deee85d9f --
Apr 16 18:30:25.391697 ip-10-0-137-47 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:30:25.905047 ip-10-0-137-47 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:25.905047 ip-10-0-137-47 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:30:25.905047 ip-10-0-137-47 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:25.905047 ip-10-0-137-47 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:30:25.905047 ip-10-0-137-47 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:25.906073 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.905932 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:30:25.909143 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909121 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:25.909143 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909138 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:25.909143 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909144 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:25.909143 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909148 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:25.909143 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909152 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909156 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909160 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909164 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909167 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909176 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909180 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909184 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909188 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909191 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909196 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909199 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909203 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909206 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909210 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909214 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909219 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909233 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909237 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909241 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:25.909460 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909245 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909249 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909253 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909256 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909260 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909264 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909271 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909277 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909281 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909285 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909289 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909293 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909297 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909301 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909307 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909311 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909315 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909319 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909324 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:25.910251 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909329 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909333 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909338 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909342 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909346 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909350 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909354 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909358 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909362 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909367 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909371 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909375 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909381 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909386 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909389 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909410 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909414 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909418 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909422 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909426 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:25.911084 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909430 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909434 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909438 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909442 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909446 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909450 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909454 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909459 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909463 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909467 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909471 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909475 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909480 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909484 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909488 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909492 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909496 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909501 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909507 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:25.911667 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909513 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909518 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909522 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.909526 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910116 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910123 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910128 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910132 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910136 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910140 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910145 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910149 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910153 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910157 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910162 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910166 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910170 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910174 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910179 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910183 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:25.912127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910188 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910192 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910197 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910201 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910205 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910209 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910213 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910218 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910222 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910227 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910230 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910235 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910239 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910243 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910247 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910251 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910255 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910259 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910263 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:25.913015 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910266 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910270 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910274 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910278 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910282 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910286 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910292 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910296 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910300 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910305 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910308 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910312 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910316 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910321 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910326 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910330 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910335 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910339 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910346 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910353 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:25.913507 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910357 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910362 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910367 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910372 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910376 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910381 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910385 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910389 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910415 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910420 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910425 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910429 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910433 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910437 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910441 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910448 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910453 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910457 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910462 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:25.914062 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910466 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910470 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910474 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910478 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910482 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910486 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910492 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910497 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910501 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910505 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910509 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.910513 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910618 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910630 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910639 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910646 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910653 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910658 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910665 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910672 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910677 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:30:25.914620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910682 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910688 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910693 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910698 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910703 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910708 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910712 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910717 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910721 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910726 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910733 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910738 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910743 2576 flags.go:64] FLAG: --config-dir="" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910747 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910752 2576 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910758 2576 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910763 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910769 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910775 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910780 2576 flags.go:64] FLAG: --contention-profiling="false" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910784 2576 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910789 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910794 2576 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910799 2576 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910806 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 18:30:25.915298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910813 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 18:30:25.915940 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:30:25.910818 2576 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910823 2576 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910827 2576 flags.go:64] FLAG: --enable-server="true" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910832 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910839 2576 flags.go:64] FLAG: --event-burst="100" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910844 2576 flags.go:64] FLAG: --event-qps="50" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910849 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910853 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910858 2576 flags.go:64] FLAG: --eviction-hard="" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910865 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910869 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910874 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910879 2576 flags.go:64] FLAG: --eviction-soft="" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910884 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910889 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 
18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910893 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910898 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910903 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910907 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910912 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910918 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910923 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910927 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910933 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910939 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:30:25.915940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910944 2576 flags.go:64] FLAG: --help="false" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910949 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-47.ec2.internal" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910954 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910958 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910963 2576 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910968 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910974 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910979 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910984 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910988 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910993 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.910998 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911003 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911007 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911012 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911017 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911022 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911026 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:30:25.916716 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:30:25.911031 2576 flags.go:64] FLAG: --lock-file="" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911035 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911040 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911044 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911053 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:30:25.916716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911058 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911062 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911067 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911074 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911080 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911084 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911088 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911094 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911100 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911106 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 18:30:25.917274 
ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911111 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911116 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911120 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911125 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911130 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911135 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911140 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911152 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911156 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911161 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911166 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911171 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911179 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911184 2576 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 16 18:30:25.917274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911189 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911193 2576 flags.go:64] FLAG: --port="10250" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911198 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911203 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-035b202500178a7b5" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911208 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.911213 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912135 2576 flags.go:64] FLAG: --register-node="true" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912144 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912149 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912156 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912161 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912167 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912171 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912177 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912182 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 
18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912187 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912192 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912197 2576 flags.go:64] FLAG: --runonce="false" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912202 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912207 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912212 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912217 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912223 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912228 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912233 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912238 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:30:25.917852 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912243 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912248 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912253 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:30:25.912258 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912263 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912268 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912272 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912299 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912306 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912311 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912319 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912324 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912329 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912334 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912339 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912343 2576 flags.go:64] FLAG: --v="2" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912349 2576 flags.go:64] FLAG: --version="false" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912356 2576 flags.go:64] FLAG: --vmodule="" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:30:25.912363 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.912368 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912540 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912548 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912554 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912558 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:30:25.918559 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912562 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912566 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912571 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912575 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912579 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912583 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912587 2576 feature_gate.go:328] unrecognized feature 
gate: MachineConfigNodes Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912591 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912596 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912601 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912605 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912611 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912617 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912621 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912625 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912629 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912634 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912638 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912642 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912646 2576 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:30:25.919125 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912651 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912655 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912658 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912662 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912666 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912671 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912676 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912680 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912684 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912693 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912697 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912701 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:30:25.919750 
ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912714 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912719 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912724 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912728 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912732 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912737 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912741 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:30:25.919750 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912745 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912750 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912754 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912758 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912763 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912767 2576 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912771 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912776 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912780 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912784 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912788 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912794 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912800 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912805 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912810 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912814 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912818 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912822 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912827 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912831 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:30:25.920212 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912835 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912839 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912845 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912849 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912853 2576 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912857 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912861 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912865 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912869 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912873 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912878 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912882 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912886 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912891 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912896 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912901 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912906 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912910 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912915 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912919 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:25.920928 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912923 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:25.921590 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912926 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:25.921590 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.912930 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:25.921590 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.913680 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:25.922458 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.922440 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:30:25.922492 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.922459 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:30:25.922524 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922506 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:25.922524 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922511 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:25.922524 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922515 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:25.922524 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922518 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:25.922524 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922521 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:25.922524 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922525 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:25.922524 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922528 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922531 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922533 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922536 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922539 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922541 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922544 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922548 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922553 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922556 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922558 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922561 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922563 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922566 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922569 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922572 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922574 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922577 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922580 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:25.922699 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922582 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922585 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922587 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922591 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922593 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922596 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922599 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922601 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922604 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922606 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922609 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922611 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922614 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922616 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922618 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922622 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922625 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922628 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922631 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922633 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:25.923145 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922636 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922638 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922641 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922644 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922647 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922649 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922652 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922654 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922657 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922660 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922662 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922665 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922667 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922670 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922672 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922675 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922678 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922681 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922684 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922686 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:25.923651 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922689 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922691 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922694 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922696 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922699 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922704 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922708 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922712 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922715 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922718 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922721 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922724 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922727 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922729 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922732 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922735 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922737 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922740 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922742 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:25.924127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922744 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922747 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.922752 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922846 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922851 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922854 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922857 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922859 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922862 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922865 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922867 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922870 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922872 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922875 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922878 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:25.924604 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922880 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922883 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922885 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922888 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922891 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922894 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922897 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922899 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922902 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922905 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922908 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922910 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922913 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922915 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922918 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922921 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922923 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922925 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922928 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:25.924971 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922931 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922933 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922936 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922938 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922940 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922943 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922945 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922948 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922951 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922953 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922956 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922958 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922961 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922963 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922966 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922968 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922971 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922973 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922977 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922979 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:25.925442 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922982 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922984 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922987 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922989 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922993 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922995 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.922998 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923000 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923003 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923006 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923008 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923010 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923013 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923015 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923018 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923020 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923023 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923026 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923029 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923032 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:25.925921 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923035 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923038 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923040 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923043 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923045 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923048 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923050 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923053 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923055 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923058 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923060 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923063 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923065 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923069 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:25.923072 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.923077 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:25.926533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.923861 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:30:25.928450 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.928437 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:30:25.929503 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.929491 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 18:30:25.929626 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.929609 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:25.929659 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.929649 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:25.957859 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.957840 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:25.964028 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.964005 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:25.984201 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.984187 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:30:25.992799 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.992775 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:25.993529 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.993517 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 18:30:25.994843 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.994826 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:30:25.999151 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.999133 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b94e9fff-45c7-4d47-9a44-2cc9435f932c:/dev/nvme0n1p4 fb893947-f85e-479a-9462-5da9071218c5:/dev/nvme0n1p3]
Apr 16 18:30:25.999206 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:25.999151 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:30:26.005085 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.004978 2576 manager.go:217] Machine: {Timestamp:2026-04-16 18:30:26.003032424 +0000 UTC m=+0.469587300 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101398 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2156fd7cd2f1ed58f40c2bd72b0967 SystemUUID:ec2156fd-7cd2-f1ed-58f4-0c2bd72b0967 BootID:a3132770-0202-42fb-af24-7d4deee85d9f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:78:bc:34:2f:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:78:bc:34:2f:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:7d:7f:30:0b:1e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:30:26.005085 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.005081 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:30:26.005186 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.005158 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:30:26.005540 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.005513 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:30:26.005712 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.005541 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-47.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:30:26.005794 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.005726 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:30:26.005794 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.005740 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:30:26.005794 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.005758 2576
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:30:26.006636 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.006624 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:30:26.008408 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.008382 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:30:26.008534 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.008523 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:30:26.011207 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.011196 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:30:26.012008 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.011997 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:30:26.012057 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.012023 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:30:26.012057 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.012040 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:30:26.012057 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.012053 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:30:26.013267 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.013252 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:30:26.013342 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.013278 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:30:26.016938 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.016919 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:30:26.019247 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:30:26.019233 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:30:26.020907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020893 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:30:26.020956 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020917 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:30:26.020956 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020926 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:30:26.020956 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020934 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:30:26.020956 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020943 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:30:26.020956 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020951 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:30:26.021092 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020959 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:30:26.021092 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020968 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:30:26.021092 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020978 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:30:26.021092 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020987 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:30:26.021092 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.020999 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:30:26.021092 
ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.021012 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:30:26.022241 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.022228 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:30:26.022241 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.022242 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:30:26.023266 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.023244 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:30:26.023361 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.023340 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:30:26.025926 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.025912 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:30:26.025993 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.025947 2576 server.go:1295] "Started kubelet" Apr 16 18:30:26.026058 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.026013 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:30:26.026130 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.026081 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:30:26.026179 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.026151 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:30:26.026616 
ip-10-0-137-47 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:30:26.027560 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.027544 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:30:26.032920 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.032677 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:30:26.038494 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.038476 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:30:26.038585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.038493 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:30:26.038983 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.038959 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:30:26.039584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039427 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:30:26.039584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039454 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:30:26.039584 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.039431 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found" Apr 16 18:30:26.039584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039503 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:30:26.039584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039510 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:30:26.039584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039567 2576 factory.go:55] Registering systemd factory Apr 16 18:30:26.039584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039584 2576 
factory.go:223] Registration of the systemd container factory successfully Apr 16 18:30:26.039952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039690 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-47.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:30:26.039952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039750 2576 factory.go:153] Registering CRI-O factory Apr 16 18:30:26.039952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039761 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 18:30:26.039952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039824 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:30:26.039952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039841 2576 factory.go:103] Registering Raw factory Apr 16 18:30:26.039952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.039850 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 18:30:26.040249 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.040223 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:30:26.040409 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.040380 2576 manager.go:319] Starting recovery of all containers Apr 16 18:30:26.040801 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.039324 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-47.ec2.internal.18a6e9da1bfca857 default 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-47.ec2.internal,UID:ip-10-0-137-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-47.ec2.internal,},FirstTimestamp:2026-04-16 18:30:26.025924695 +0000 UTC m=+0.492479572,LastTimestamp:2026-04-16 18:30:26.025924695 +0000 UTC m=+0.492479572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-47.ec2.internal,}" Apr 16 18:30:26.046695 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.046556 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:30:26.046846 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.046751 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:30:26.050542 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.050411 2576 manager.go:324] Recovery completed Apr 16 18:30:26.054534 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.054517 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:26.057017 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.056994 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:26.057104 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.057035 2576 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:26.057104 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.057049 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:26.057527 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.057515 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:30:26.057527 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.057527 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:30:26.057606 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.057542 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:30:26.060211 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.060199 2576 policy_none.go:49] "None policy: Start" Apr 16 18:30:26.060262 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.060215 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:30:26.060262 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.060225 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:30:26.060407 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.060331 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-47.ec2.internal.18a6e9da1dd71ab7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-47.ec2.internal,UID:ip-10-0-137-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-47.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-47.ec2.internal,},FirstTimestamp:2026-04-16 18:30:26.057018039 +0000 UTC 
m=+0.523572916,LastTimestamp:2026-04-16 18:30:26.057018039 +0000 UTC m=+0.523572916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-47.ec2.internal,}" Apr 16 18:30:26.063829 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.063801 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sxnpk" Apr 16 18:30:26.070454 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.070430 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sxnpk" Apr 16 18:30:26.072349 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.072285 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-47.ec2.internal.18a6e9da1dd77611 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-47.ec2.internal,UID:ip-10-0-137-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-137-47.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-137-47.ec2.internal,},FirstTimestamp:2026-04-16 18:30:26.057041425 +0000 UTC m=+0.523596303,LastTimestamp:2026-04-16 18:30:26.057041425 +0000 UTC m=+0.523596303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-47.ec2.internal,}" Apr 16 18:30:26.103951 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.103934 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.103970 2576 manager.go:517] "Failed to read 
data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.103983 2576 server.go:85] "Starting device plugin registration server" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.104222 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.104232 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.104325 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.104435 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.104446 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.104940 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:30:26.121346 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.104975 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-47.ec2.internal\" not found" Apr 16 18:30:26.173323 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.173242 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:30:26.174656 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.174640 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 18:30:26.174749 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.174670 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:30:26.174749 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.174691 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:30:26.174749 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.174700 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:30:26.174863 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.174741 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:30:26.180511 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.180489 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:26.204615 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.204598 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:26.205493 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.205461 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:26.205577 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.205501 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:26.205577 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.205512 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:26.205577 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.205534 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-47.ec2.internal" Apr 16 18:30:26.214984 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.214967 2576 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-47.ec2.internal" Apr 16 18:30:26.215027 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.214988 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-47.ec2.internal\": node \"ip-10-0-137-47.ec2.internal\" not found" Apr 16 18:30:26.245136 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.245116 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found" Apr 16 18:30:26.275739 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.275719 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal"] Apr 16 18:30:26.275792 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.275782 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:26.276660 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.276645 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:26.276718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.276673 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:26.276718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.276683 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:26.278904 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.278893 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:26.279048 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.279033 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal" Apr 16 18:30:26.279121 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.279064 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:26.279531 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.279517 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:26.279614 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.279537 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:26.279614 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.279546 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:26.279614 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.279522 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:26.279614 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.279584 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:26.279614 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.279600 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:26.282068 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.282054 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal" Apr 16 18:30:26.282115 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.282080 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:26.282703 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.282685 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:26.282780 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.282715 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:26.282780 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.282728 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:26.306010 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.305988 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-47.ec2.internal\" not found" node="ip-10-0-137-47.ec2.internal" Apr 16 18:30:26.310322 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.310307 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-47.ec2.internal\" not found" node="ip-10-0-137-47.ec2.internal" Apr 16 18:30:26.345983 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.345958 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found" Apr 16 18:30:26.441917 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.441826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/17da44616c39894cc6f4732c6b243af1-config\") pod 
\"kube-apiserver-proxy-ip-10-0-137-47.ec2.internal\" (UID: \"17da44616c39894cc6f4732c6b243af1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.441917 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.441861 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce83f7e1bd6306ca8a455adfbee2f9ec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal\" (UID: \"ce83f7e1bd6306ca8a455adfbee2f9ec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.441917 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.441885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce83f7e1bd6306ca8a455adfbee2f9ec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal\" (UID: \"ce83f7e1bd6306ca8a455adfbee2f9ec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.446894 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.446874 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:26.542725 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.542681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/17da44616c39894cc6f4732c6b243af1-config\") pod \"kube-apiserver-proxy-ip-10-0-137-47.ec2.internal\" (UID: \"17da44616c39894cc6f4732c6b243af1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.542725 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.542731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce83f7e1bd6306ca8a455adfbee2f9ec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal\" (UID: \"ce83f7e1bd6306ca8a455adfbee2f9ec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.542882 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.542750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce83f7e1bd6306ca8a455adfbee2f9ec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal\" (UID: \"ce83f7e1bd6306ca8a455adfbee2f9ec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.542882 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.542786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce83f7e1bd6306ca8a455adfbee2f9ec-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal\" (UID: \"ce83f7e1bd6306ca8a455adfbee2f9ec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.542882 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.542789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ce83f7e1bd6306ca8a455adfbee2f9ec-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal\" (UID: \"ce83f7e1bd6306ca8a455adfbee2f9ec\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.542882 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.542795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/17da44616c39894cc6f4732c6b243af1-config\") pod \"kube-apiserver-proxy-ip-10-0-137-47.ec2.internal\" (UID: \"17da44616c39894cc6f4732c6b243af1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.547789 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.547771 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:26.608994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.608961 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.612459 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.612442 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:26.648211 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.648185 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:26.748867 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.748790 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:26.849436 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.849407 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:26.926966 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.926936 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:26.930310 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.930291 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:30:26.930452 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.930435 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:30:26.930501 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:26.930435 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:30:26.949849 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:26.949825 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:27.039513 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.039456 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:30:27.050926 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:27.050899 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:27.057183 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.057161 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:27.072994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.072951 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:25:26 +0000 UTC" deadline="2027-12-25 22:34:29.7702302 +0000 UTC"
Apr 16 18:30:27.072994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.072990 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14836h4m2.697242907s"
Apr 16 18:30:27.079468 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.079450 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vhckn"
Apr 16 18:30:27.087629 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.087609 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vhckn"
Apr 16 18:30:27.105197 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:27.105157 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17da44616c39894cc6f4732c6b243af1.slice/crio-b3fb1ba34dcca8eb4086fddc2b5a52de4a8c3662f39d674927ceba3b5b625c91 WatchSource:0}: Error finding container b3fb1ba34dcca8eb4086fddc2b5a52de4a8c3662f39d674927ceba3b5b625c91: Status 404 returned error can't find the container with id b3fb1ba34dcca8eb4086fddc2b5a52de4a8c3662f39d674927ceba3b5b625c91
Apr 16 18:30:27.105807 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:27.105787 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce83f7e1bd6306ca8a455adfbee2f9ec.slice/crio-d100d1dc01de9ff75ade60234463d55344b710ed636506d8bb50c39e16c30416 WatchSource:0}: Error finding container d100d1dc01de9ff75ade60234463d55344b710ed636506d8bb50c39e16c30416: Status 404 returned error can't find the container with id d100d1dc01de9ff75ade60234463d55344b710ed636506d8bb50c39e16c30416
Apr 16 18:30:27.111102 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.111088 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:30:27.151315 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:27.151296 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:27.178178 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.178128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal" event={"ID":"ce83f7e1bd6306ca8a455adfbee2f9ec","Type":"ContainerStarted","Data":"d100d1dc01de9ff75ade60234463d55344b710ed636506d8bb50c39e16c30416"}
Apr 16 18:30:27.179106 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.179084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal" event={"ID":"17da44616c39894cc6f4732c6b243af1","Type":"ContainerStarted","Data":"b3fb1ba34dcca8eb4086fddc2b5a52de4a8c3662f39d674927ceba3b5b625c91"}
Apr 16 18:30:27.205835 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.204743 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:27.252169 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:27.252143 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:27.352708 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:27.352678 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-47.ec2.internal\" not found"
Apr 16 18:30:27.404292 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.404266 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:27.439057 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.439027 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:27.449287 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.449257 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:27.452644 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.452619 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal"
Apr 16 18:30:27.471190 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.471167 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:27.979021 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:27.978989 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:28.014180 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.014150 2576 apiserver.go:52] "Watching apiserver"
Apr 16 18:30:28.023444 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.023418 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:30:28.023813 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.023785 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-j85qr","kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal","openshift-image-registry/node-ca-9k6mz","openshift-multus/multus-w4gwm","openshift-multus/network-metrics-daemon-n66hf","openshift-network-diagnostics/network-check-target-qbq69","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss","openshift-cluster-node-tuning-operator/tuned-ljxdq","openshift-dns/node-resolver-rs8w8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal","openshift-multus/multus-additional-cni-plugins-ll6hq","openshift-network-operator/iptables-alerter-d968n","openshift-ovn-kubernetes/ovnkube-node-tchmw"]
Apr 16 18:30:28.028352 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.028332 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.028588 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.028572 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9k6mz"
Apr 16 18:30:28.030603 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.030580 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.032847 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.032831 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:28.032942 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.032895 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75"
Apr 16 18:30:28.035062 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.035041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:30:28.035156 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.035104 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f"
Apr 16 18:30:28.037179 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.037158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:28.038718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.038698 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:30:28.038815 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.038721 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:30:28.038815 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.038736 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:30:28.038925 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.038881 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jmdtp\""
Apr 16 18:30:28.038925 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.038889 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:30:28.039020 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.038925 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:30:28.039020 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.038884 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:30:28.039020 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.038989 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:30:28.039343 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.039330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.041709 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.041691 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.043510 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.043469 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:30:28.043696 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.043678 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:30:28.043823 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.043793 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9rx88\""
Apr 16 18:30:28.043890 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.043823 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-sqt88\""
Apr 16 18:30:28.043890 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.043866 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:30:28.044163 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.044141 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:30:28.044255 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.044207 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:30:28.044314 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.044251 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:30:28.044467 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.044453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:30:28.044519 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.044499 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xs87t\""
Apr 16 18:30:28.044738 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.044724 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:30:28.044738 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.044734 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:30:28.045275 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.045260 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dxrxn\""
Apr 16 18:30:28.047239 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.047187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.047834 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.047819 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-d968n"
Apr 16 18:30:28.048630 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.048611 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-s267k\""
Apr 16 18:30:28.049763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.049747 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.051206 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.050880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.051206 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.050948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crsvl\" (UniqueName: \"kubernetes.io/projected/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-kube-api-access-crsvl\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.051206 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.050982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysconfig\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.051206 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-socket-dir-parent\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.051206 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-kubelet\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.051525 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-etc-kubernetes\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.051525 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dffbf089-0f9c-412d-8cef-d3e8343e0951-konnectivity-ca\") pod \"konnectivity-agent-j85qr\" (UID: \"dffbf089-0f9c-412d-8cef-d3e8343e0951\") " pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:28.051525 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysctl-conf\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.051525 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-cni-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.051525 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-netns\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.051525 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-device-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.051525 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysctl-d\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.051748 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-lib-modules\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.051748 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk6sb\" (UniqueName: \"kubernetes.io/projected/129c086c-bc70-4407-a43e-26664dfb816c-kube-api-access-pk6sb\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.051748 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-cnibin\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.051882 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-os-release\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.051920 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-socket-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.051971 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-systemd\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.052003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.051969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-run\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.052044 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-host\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.052078 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/129c086c-bc70-4407-a43e-26664dfb816c-tmp-dir\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.052116 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtwtd\" (UniqueName: \"kubernetes.io/projected/e3170e08-0669-40fc-b2a2-105f865f2be9-kube-api-access-gtwtd\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.052242 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:28.052301 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-registration-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.052301 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-host\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz"
Apr 16 18:30:28.052417 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-k8s-cni-cncf-io\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.052417 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-cni-bin\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.052529 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dffbf089-0f9c-412d-8cef-d3e8343e0951-agent-certs\") pod \"konnectivity-agent-j85qr\" (UID: \"dffbf089-0f9c-412d-8cef-d3e8343e0951\") " pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:28.052584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:30:28.052722 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-modprobe-d\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.052778 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-sys\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.052827 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-tuned\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.052827 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh827\" (UniqueName: \"kubernetes.io/projected/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-kube-api-access-lh827\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.052933 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052859 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/129c086c-bc70-4407-a43e-26664dfb816c-hosts-file\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.052933 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-system-cni-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.052933 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-cni-multus\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.053075 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdhjz\" (UniqueName: \"kubernetes.io/projected/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-kube-api-access-mdhjz\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz"
Apr 16 18:30:28.053075 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-kubernetes\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.053075 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.052999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-var-lib-kubelet\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.053075 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3170e08-0669-40fc-b2a2-105f865f2be9-cni-binary-copy\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.053075 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-conf-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbdsg\" (UniqueName: \"kubernetes.io/projected/e8425304-94d1-408f-ac22-f5bb6adfce75-kube-api-access-lbdsg\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-etc-selinux\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-sys-fs\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-serviceca\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz"
Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-tmp\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") "
pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-hostroot\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053266 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-daemon-config\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053294 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 18:30:28.053302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-multus-certs\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.053762 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.053312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 18:30:28.057856 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.057834 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 18:30:28.058762 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.058746 2576 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 18:30:28.058960 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.058942 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 18:30:28.059055 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.059019 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:30:28.059121 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.059100 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 18:30:28.059121 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.059108 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-v56sb\"" Apr 16 18:30:28.059245 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.059209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 18:30:28.059370 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.059349 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 18:30:28.059488 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.059437 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-bp4wk\"" Apr 16 18:30:28.059578 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.059520 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 18:30:28.059643 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.059621 2576 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 18:30:28.063174 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.063157 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mzmbh\"" Apr 16 18:30:28.088738 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.088712 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:27 +0000 UTC" deadline="2027-12-04 18:45:16.874118225 +0000 UTC" Apr 16 18:30:28.088738 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.088736 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14328h14m48.785384838s" Apr 16 18:30:28.141136 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.141107 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:30:28.154094 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:28.154266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-modprobe-d\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.154266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154123 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-sys\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.154266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-tuned\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.154266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-system-cni-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.154266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-sys\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.154266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-kubelet\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.154266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-modprobe-d\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.154266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-ovn\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-system-cni-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-run-ovn-kubernetes\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdhjz\" (UniqueName: \"kubernetes.io/projected/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-kube-api-access-mdhjz\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154371 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-kubernetes\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-conf-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-cnibin\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-host-slash\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n" Apr 16 18:30:28.154737 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:30:28.154525 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-conf-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-sys-fs\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-serviceca\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-tmp\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-sys-fs\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-socket-dir-parent\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.154737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dffbf089-0f9c-412d-8cef-d3e8343e0951-konnectivity-ca\") pod \"konnectivity-agent-j85qr\" (UID: \"dffbf089-0f9c-412d-8cef-d3e8343e0951\") " pod="kube-system/konnectivity-agent-j85qr" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154781 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154820 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpvnb\" (UniqueName: \"kubernetes.io/projected/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-kube-api-access-fpvnb\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysctl-conf\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-socket-dir-parent\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-slash\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-etc-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-device-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.154984 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysctl-d\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-cnibin\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-os-release\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " 
pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-cni-netd\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155087 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-serviceca\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-socket-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-os-release\") pod 
\"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.155520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysctl-d\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-host\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dffbf089-0f9c-412d-8cef-d3e8343e0951-konnectivity-ca\") pod 
\"konnectivity-agent-j85qr\" (UID: \"dffbf089-0f9c-412d-8cef-d3e8343e0951\") " pod="kube-system/konnectivity-agent-j85qr" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-systemd\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-registration-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-socket-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-cni-bin\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-run-netns\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-cnibin\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysctl-conf\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-log-socket\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lh827\" (UniqueName: \"kubernetes.io/projected/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-kube-api-access-lh827\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-registration-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-device-dir\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/129c086c-bc70-4407-a43e-26664dfb816c-hosts-file\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.156215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-kubernetes\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-cni-multus\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-host\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-cni-bin\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-cni-bin\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.155611 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmsv2\" (UniqueName: \"kubernetes.io/projected/08cb14f4-383f-4b43-8944-b2fe93cf6dff-kube-api-access-vmsv2\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qczqj\" (UniqueName: \"kubernetes.io/projected/f2a54163-a62f-47da-993d-f3471a740635-kube-api-access-qczqj\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/129c086c-bc70-4407-a43e-26664dfb816c-hosts-file\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-cni-multus\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.155802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs podName:e8425304-94d1-408f-ac22-f5bb6adfce75 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:28.655767861 +0000 UTC m=+3.122322768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs") pod "network-metrics-daemon-n66hf" (UID: "e8425304-94d1-408f-ac22-f5bb6adfce75") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-var-lib-kubelet\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3170e08-0669-40fc-b2a2-105f865f2be9-cni-binary-copy\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbdsg\" (UniqueName: \"kubernetes.io/projected/e8425304-94d1-408f-ac22-f5bb6adfce75-kube-api-access-lbdsg\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovn-node-metrics-cert\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.155992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-etc-selinux\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-hostroot\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.156992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-daemon-config\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156066 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-multus-certs\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crsvl\" (UniqueName: \"kubernetes.io/projected/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-kube-api-access-crsvl\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysconfig\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-kubelet\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-etc-selinux\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-etc-kubernetes\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-hostroot\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovnkube-config\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e3170e08-0669-40fc-b2a2-105f865f2be9-cni-binary-copy\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-sysconfig\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-var-lib-kubelet\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-var-lib-kubelet\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-etc-kubernetes\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156610 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-system-cni-dir\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-os-release\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.157763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-multus-certs\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-cni-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-netns\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-daemon-config\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-systemd-units\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-env-overrides\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156771 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-netns\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-lib-modules\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-multus-cni-dir\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk6sb\" (UniqueName: \"kubernetes.io/projected/129c086c-bc70-4407-a43e-26664dfb816c-kube-api-access-pk6sb\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-node-log\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-lib-modules\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-iptables-alerter-script\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-systemd\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-run\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.156980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/129c086c-bc70-4407-a43e-26664dfb816c-tmp-dir\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtwtd\" (UniqueName: \"kubernetes.io/projected/e3170e08-0669-40fc-b2a2-105f865f2be9-kube-api-access-gtwtd\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-systemd\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dffbf089-0f9c-412d-8cef-d3e8343e0951-agent-certs\") pod \"konnectivity-agent-j85qr\" (UID: \"dffbf089-0f9c-412d-8cef-d3e8343e0951\") " pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-host\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-k8s-cni-cncf-io\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-var-lib-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovnkube-script-lib\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/129c086c-bc70-4407-a43e-26664dfb816c-tmp-dir\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-host\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-run\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.157450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e3170e08-0669-40fc-b2a2-105f865f2be9-host-run-k8s-cni-cncf-io\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.158332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-tmp\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.158976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.158382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-etc-tuned\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.159745 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.159726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dffbf089-0f9c-412d-8cef-d3e8343e0951-agent-certs\") pod \"konnectivity-agent-j85qr\" (UID: \"dffbf089-0f9c-412d-8cef-d3e8343e0951\") " pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:28.164170 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.164150 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:28.164251 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.164175 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:28.164251 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.164188 2576 projected.go:194] Error preparing data for projected volume kube-api-access-pncs2 for pod openshift-network-diagnostics/network-check-target-qbq69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:28.164332 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.164254 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2 podName:8837a43b-32fb-45cb-9303-bc2b56966e5f nodeName:}" failed. No retries permitted until 2026-04-16 18:30:28.664234714 +0000 UTC m=+3.130789599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-pncs2" (UniqueName: "kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2") pod "network-check-target-qbq69" (UID: "8837a43b-32fb-45cb-9303-bc2b56966e5f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:28.166423 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.166384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdhjz\" (UniqueName: \"kubernetes.io/projected/1c74f02e-39bc-4ee2-bd6c-07d23ece32a2-kube-api-access-mdhjz\") pod \"node-ca-9k6mz\" (UID: \"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2\") " pod="openshift-image-registry/node-ca-9k6mz"
Apr 16 18:30:28.167869 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.167820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtwtd\" (UniqueName: \"kubernetes.io/projected/e3170e08-0669-40fc-b2a2-105f865f2be9-kube-api-access-gtwtd\") pod \"multus-w4gwm\" (UID: \"e3170e08-0669-40fc-b2a2-105f865f2be9\") " pod="openshift-multus/multus-w4gwm"
Apr 16 18:30:28.170299 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.170242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh827\" (UniqueName: \"kubernetes.io/projected/f0b61b79-c7e3-4bac-b7fc-82e6ba400420-kube-api-access-lh827\") pod \"tuned-ljxdq\" (UID: \"f0b61b79-c7e3-4bac-b7fc-82e6ba400420\") " pod="openshift-cluster-node-tuning-operator/tuned-ljxdq"
Apr 16 18:30:28.170427 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.170337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crsvl\" (UniqueName: \"kubernetes.io/projected/292f88d2-b6d0-4f0b-95e8-aad4c415fc43-kube-api-access-crsvl\") pod \"aws-ebs-csi-driver-node-6njss\" (UID: \"292f88d2-b6d0-4f0b-95e8-aad4c415fc43\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss"
Apr 16 18:30:28.170427 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.170358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbdsg\" (UniqueName: \"kubernetes.io/projected/e8425304-94d1-408f-ac22-f5bb6adfce75-kube-api-access-lbdsg\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:28.171717 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.171694 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk6sb\" (UniqueName: \"kubernetes.io/projected/129c086c-bc70-4407-a43e-26664dfb816c-kube-api-access-pk6sb\") pod \"node-resolver-rs8w8\" (UID: \"129c086c-bc70-4407-a43e-26664dfb816c\") " pod="openshift-dns/node-resolver-rs8w8"
Apr 16 18:30:28.258526 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovn-node-metrics-cert\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.258526 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.258526 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovnkube-config\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-system-cni-dir\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-os-release\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-systemd-units\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-env-overrides\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-system-cni-dir\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-node-log\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-iptables-alerter-script\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n"
Apr 16 18:30:28.258774 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-var-lib-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovnkube-script-lib\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-node-log\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-os-release\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq"
Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-kubelet\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") "
pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-ovn\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-run-ovn-kubernetes\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-var-lib-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-systemd-units\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: 
\"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-cnibin\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.258996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-host-slash\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpvnb\" (UniqueName: \"kubernetes.io/projected/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-kube-api-access-fpvnb\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-slash\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259217 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-etc-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-cni-netd\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259208 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-systemd\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-run-netns\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-log-socket\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-iptables-alerter-script\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n" Apr 16 18:30:28.259994 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:30:28.259341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-cni-bin\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovnkube-config\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmsv2\" (UniqueName: \"kubernetes.io/projected/08cb14f4-383f-4b43-8944-b2fe93cf6dff-kube-api-access-vmsv2\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-host-slash\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qczqj\" (UniqueName: \"kubernetes.io/projected/f2a54163-a62f-47da-993d-f3471a740635-kube-api-access-qczqj\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " 
pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-ovn\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-cnibin\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-kubelet\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-run-ovn-kubernetes\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-slash\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.259994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-etc-openvswitch\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-run-systemd\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-log-socket\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-cni-bin\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2a54163-a62f-47da-993d-f3471a740635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-run-netns\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/08cb14f4-383f-4b43-8944-b2fe93cf6dff-host-cni-netd\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.259964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.260302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-env-overrides\") pod 
\"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.260371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovnkube-script-lib\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.260716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.260643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2a54163-a62f-47da-993d-f3471a740635-cni-binary-copy\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.261126 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.261105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/08cb14f4-383f-4b43-8944-b2fe93cf6dff-ovn-node-metrics-cert\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.268497 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.268461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qczqj\" (UniqueName: \"kubernetes.io/projected/f2a54163-a62f-47da-993d-f3471a740635-kube-api-access-qczqj\") pod \"multus-additional-cni-plugins-ll6hq\" (UID: \"f2a54163-a62f-47da-993d-f3471a740635\") " pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.268874 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.268856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmsv2\" (UniqueName: 
\"kubernetes.io/projected/08cb14f4-383f-4b43-8944-b2fe93cf6dff-kube-api-access-vmsv2\") pod \"ovnkube-node-tchmw\" (UID: \"08cb14f4-383f-4b43-8944-b2fe93cf6dff\") " pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.268988 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.268969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpvnb\" (UniqueName: \"kubernetes.io/projected/4c92cc7a-a8d8-4824-ac8c-b83aca2188a9-kube-api-access-fpvnb\") pod \"iptables-alerter-d968n\" (UID: \"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9\") " pod="openshift-network-operator/iptables-alerter-d968n" Apr 16 18:30:28.338949 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.338916 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" Apr 16 18:30:28.346737 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.346711 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9k6mz" Apr 16 18:30:28.357358 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.357338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w4gwm" Apr 16 18:30:28.362070 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.362052 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j85qr" Apr 16 18:30:28.368411 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.368382 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" Apr 16 18:30:28.374888 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.374869 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rs8w8" Apr 16 18:30:28.382407 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.382376 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" Apr 16 18:30:28.389919 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.389898 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-d968n" Apr 16 18:30:28.395571 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.395552 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" Apr 16 18:30:28.661636 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.661600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:28.661825 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.661729 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:28.661825 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.661806 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs podName:e8425304-94d1-408f-ac22-f5bb6adfce75 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:29.661786712 +0000 UTC m=+4.128341593 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs") pod "network-metrics-daemon-n66hf" (UID: "e8425304-94d1-408f-ac22-f5bb6adfce75") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:28.762084 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:28.762051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:28.762222 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.762191 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:28.762222 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.762207 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:28.762222 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.762216 2576 projected.go:194] Error preparing data for projected volume kube-api-access-pncs2 for pod openshift-network-diagnostics/network-check-target-qbq69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:28.762338 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:28.762266 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2 podName:8837a43b-32fb-45cb-9303-bc2b56966e5f nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:29.762251798 +0000 UTC m=+4.228806661 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pncs2" (UniqueName: "kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2") pod "network-check-target-qbq69" (UID: "8837a43b-32fb-45cb-9303-bc2b56966e5f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:28.786833 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.786745 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c92cc7a_a8d8_4824_ac8c_b83aca2188a9.slice/crio-f7b9cc7583f00f8a8134ff3147ee64cc671d1c21167206302ddfa325ab7c1e3b WatchSource:0}: Error finding container f7b9cc7583f00f8a8134ff3147ee64cc671d1c21167206302ddfa325ab7c1e3b: Status 404 returned error can't find the container with id f7b9cc7583f00f8a8134ff3147ee64cc671d1c21167206302ddfa325ab7c1e3b Apr 16 18:30:28.789526 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.789504 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod129c086c_bc70_4407_a43e_26664dfb816c.slice/crio-be40d7a922db78b9304a38e91e6fde5c732e15fef7e241c3951b6f81fa41115b WatchSource:0}: Error finding container be40d7a922db78b9304a38e91e6fde5c732e15fef7e241c3951b6f81fa41115b: Status 404 returned error can't find the container with id be40d7a922db78b9304a38e91e6fde5c732e15fef7e241c3951b6f81fa41115b Apr 16 18:30:28.790675 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.790655 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0b61b79_c7e3_4bac_b7fc_82e6ba400420.slice/crio-5dc18993cd3a2f70b91f518e20e5ca8290abdee5872568eccb2f047f6f41bfb2 WatchSource:0}: Error finding container 
5dc18993cd3a2f70b91f518e20e5ca8290abdee5872568eccb2f047f6f41bfb2: Status 404 returned error can't find the container with id 5dc18993cd3a2f70b91f518e20e5ca8290abdee5872568eccb2f047f6f41bfb2 Apr 16 18:30:28.794613 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.794590 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c74f02e_39bc_4ee2_bd6c_07d23ece32a2.slice/crio-686c5451a3798e3b5775ca4a3e8b23657b1d1f324b2ad4ba57e4b918bc57c8e7 WatchSource:0}: Error finding container 686c5451a3798e3b5775ca4a3e8b23657b1d1f324b2ad4ba57e4b918bc57c8e7: Status 404 returned error can't find the container with id 686c5451a3798e3b5775ca4a3e8b23657b1d1f324b2ad4ba57e4b918bc57c8e7 Apr 16 18:30:28.795626 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.795545 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a54163_a62f_47da_993d_f3471a740635.slice/crio-87dedab5664a700b8a0836c7665c816e343b88f9e115393ade47a24ca0a7fa60 WatchSource:0}: Error finding container 87dedab5664a700b8a0836c7665c816e343b88f9e115393ade47a24ca0a7fa60: Status 404 returned error can't find the container with id 87dedab5664a700b8a0836c7665c816e343b88f9e115393ade47a24ca0a7fa60 Apr 16 18:30:28.796086 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.796065 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddffbf089_0f9c_412d_8cef_d3e8343e0951.slice/crio-619248cf9e896a8796d3eb6d97d349344ea85f50d9b60793bd72c6bb19145c4c WatchSource:0}: Error finding container 619248cf9e896a8796d3eb6d97d349344ea85f50d9b60793bd72c6bb19145c4c: Status 404 returned error can't find the container with id 619248cf9e896a8796d3eb6d97d349344ea85f50d9b60793bd72c6bb19145c4c Apr 16 18:30:28.796735 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.796705 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08cb14f4_383f_4b43_8944_b2fe93cf6dff.slice/crio-e0c992b4ad51a32e06c9e84f93d2a7677602cf2570b51ab77a6b0f62cee2f225 WatchSource:0}: Error finding container e0c992b4ad51a32e06c9e84f93d2a7677602cf2570b51ab77a6b0f62cee2f225: Status 404 returned error can't find the container with id e0c992b4ad51a32e06c9e84f93d2a7677602cf2570b51ab77a6b0f62cee2f225 Apr 16 18:30:28.819127 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.819097 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3170e08_0669_40fc_b2a2_105f865f2be9.slice/crio-f88e68c22b82cc8b6ab3e18e713f91443a2d59704fa81edf9a68e0de51e6de83 WatchSource:0}: Error finding container f88e68c22b82cc8b6ab3e18e713f91443a2d59704fa81edf9a68e0de51e6de83: Status 404 returned error can't find the container with id f88e68c22b82cc8b6ab3e18e713f91443a2d59704fa81edf9a68e0de51e6de83 Apr 16 18:30:28.819676 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:30:28.819652 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod292f88d2_b6d0_4f0b_95e8_aad4c415fc43.slice/crio-73e243a062f36dfc9843636fe58fbbbce172e5727e8bc5d0f569a42791276570 WatchSource:0}: Error finding container 73e243a062f36dfc9843636fe58fbbbce172e5727e8bc5d0f569a42791276570: Status 404 returned error can't find the container with id 73e243a062f36dfc9843636fe58fbbbce172e5727e8bc5d0f569a42791276570 Apr 16 18:30:29.089900 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.089733 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:27 +0000 UTC" deadline="2028-01-14 12:46:24.070354764 +0000 UTC" Apr 16 18:30:29.089900 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.089894 2576 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15306h15m54.980463481s" Apr 16 18:30:29.185716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.185652 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j85qr" event={"ID":"dffbf089-0f9c-412d-8cef-d3e8343e0951","Type":"ContainerStarted","Data":"619248cf9e896a8796d3eb6d97d349344ea85f50d9b60793bd72c6bb19145c4c"} Apr 16 18:30:29.193528 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.193460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" event={"ID":"f0b61b79-c7e3-4bac-b7fc-82e6ba400420","Type":"ContainerStarted","Data":"5dc18993cd3a2f70b91f518e20e5ca8290abdee5872568eccb2f047f6f41bfb2"} Apr 16 18:30:29.201126 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.201090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" event={"ID":"292f88d2-b6d0-4f0b-95e8-aad4c415fc43","Type":"ContainerStarted","Data":"73e243a062f36dfc9843636fe58fbbbce172e5727e8bc5d0f569a42791276570"} Apr 16 18:30:29.205361 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.205283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"e0c992b4ad51a32e06c9e84f93d2a7677602cf2570b51ab77a6b0f62cee2f225"} Apr 16 18:30:29.207811 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.207764 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" event={"ID":"f2a54163-a62f-47da-993d-f3471a740635","Type":"ContainerStarted","Data":"87dedab5664a700b8a0836c7665c816e343b88f9e115393ade47a24ca0a7fa60"} Apr 16 18:30:29.211112 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.211082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9k6mz" 
event={"ID":"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2","Type":"ContainerStarted","Data":"686c5451a3798e3b5775ca4a3e8b23657b1d1f324b2ad4ba57e4b918bc57c8e7"} Apr 16 18:30:29.221422 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.221379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rs8w8" event={"ID":"129c086c-bc70-4407-a43e-26664dfb816c","Type":"ContainerStarted","Data":"be40d7a922db78b9304a38e91e6fde5c732e15fef7e241c3951b6f81fa41115b"} Apr 16 18:30:29.223191 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.223166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-d968n" event={"ID":"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9","Type":"ContainerStarted","Data":"f7b9cc7583f00f8a8134ff3147ee64cc671d1c21167206302ddfa325ab7c1e3b"} Apr 16 18:30:29.226378 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.226353 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal" event={"ID":"17da44616c39894cc6f4732c6b243af1","Type":"ContainerStarted","Data":"854f2e93f59b25d0c19e8af52fe964400fe7a5cedc6365b1b9249ef294475d87"} Apr 16 18:30:29.228716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.228692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w4gwm" event={"ID":"e3170e08-0669-40fc-b2a2-105f865f2be9","Type":"ContainerStarted","Data":"f88e68c22b82cc8b6ab3e18e713f91443a2d59704fa81edf9a68e0de51e6de83"} Apr 16 18:30:29.667875 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.667366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:29.667875 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:29.667518 2576 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:29.667875 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:29.667572 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs podName:e8425304-94d1-408f-ac22-f5bb6adfce75 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:31.667555245 +0000 UTC m=+6.134110109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs") pod "network-metrics-daemon-n66hf" (UID: "e8425304-94d1-408f-ac22-f5bb6adfce75") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:29.769448 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.768931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:29.769448 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:29.769066 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:29.769448 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:29.769080 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:29.769448 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:29.769088 2576 projected.go:194] Error preparing data for projected volume kube-api-access-pncs2 for pod 
openshift-network-diagnostics/network-check-target-qbq69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:29.769448 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:29.769134 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2 podName:8837a43b-32fb-45cb-9303-bc2b56966e5f nodeName:}" failed. No retries permitted until 2026-04-16 18:30:31.769120301 +0000 UTC m=+6.235675168 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pncs2" (UniqueName: "kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2") pod "network-check-target-qbq69" (UID: "8837a43b-32fb-45cb-9303-bc2b56966e5f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:29.807239 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:29.807211 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:30.177771 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:30.177743 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:30.178171 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:30.177873 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:30.178623 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:30.178385 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:30.178749 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:30.178727 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:30.239776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:30.239591 2576 generic.go:358] "Generic (PLEG): container finished" podID="ce83f7e1bd6306ca8a455adfbee2f9ec" containerID="fd191868c677ad6cd37f637680e608e946c8a2d9b31702d9f20a00333db1c3e3" exitCode=0 Apr 16 18:30:30.239776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:30.239676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal" event={"ID":"ce83f7e1bd6306ca8a455adfbee2f9ec","Type":"ContainerDied","Data":"fd191868c677ad6cd37f637680e608e946c8a2d9b31702d9f20a00333db1c3e3"} Apr 16 18:30:30.256379 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:30.256307 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-47.ec2.internal" podStartSLOduration=3.256288376 podStartE2EDuration="3.256288376s" podCreationTimestamp="2026-04-16 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:29.245848486 +0000 UTC m=+3.712403375" 
watchObservedRunningTime="2026-04-16 18:30:30.256288376 +0000 UTC m=+4.722843262" Apr 16 18:30:31.256678 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:31.256633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal" event={"ID":"ce83f7e1bd6306ca8a455adfbee2f9ec","Type":"ContainerStarted","Data":"1ab63f9fd733d42aa7e17d20312539777a466f520dd9da9df775583fed0b55fa"} Apr 16 18:30:31.285124 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:31.285079 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-47.ec2.internal" podStartSLOduration=4.28505957 podStartE2EDuration="4.28505957s" podCreationTimestamp="2026-04-16 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:31.284374621 +0000 UTC m=+5.750929508" watchObservedRunningTime="2026-04-16 18:30:31.28505957 +0000 UTC m=+5.751614458" Apr 16 18:30:31.686198 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:31.686157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:31.686537 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:31.686468 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:31.686640 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:31.686543 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs podName:e8425304-94d1-408f-ac22-f5bb6adfce75 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:35.686522734 +0000 UTC m=+10.153077615 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs") pod "network-metrics-daemon-n66hf" (UID: "e8425304-94d1-408f-ac22-f5bb6adfce75") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:31.787039 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:31.786999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:31.787224 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:31.787178 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:31.787224 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:31.787200 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:31.787224 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:31.787211 2576 projected.go:194] Error preparing data for projected volume kube-api-access-pncs2 for pod openshift-network-diagnostics/network-check-target-qbq69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:31.787361 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:31.787279 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2 
podName:8837a43b-32fb-45cb-9303-bc2b56966e5f nodeName:}" failed. No retries permitted until 2026-04-16 18:30:35.787252683 +0000 UTC m=+10.253807567 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pncs2" (UniqueName: "kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2") pod "network-check-target-qbq69" (UID: "8837a43b-32fb-45cb-9303-bc2b56966e5f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:32.177209 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:32.177180 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:32.177390 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:32.177310 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:32.177390 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:32.177363 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:32.177542 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:32.177452 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:34.176583 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:34.176546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:34.177058 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:34.176689 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:34.178306 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:34.178277 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:34.178458 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:34.178367 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:35.717948 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:35.717905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:35.718334 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:35.718102 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:35.718334 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:35.718160 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs podName:e8425304-94d1-408f-ac22-f5bb6adfce75 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:43.718146007 +0000 UTC m=+18.184700871 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs") pod "network-metrics-daemon-n66hf" (UID: "e8425304-94d1-408f-ac22-f5bb6adfce75") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:35.818883 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:35.818834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:35.819069 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:35.819005 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:35.819116 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:35.819079 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:35.819116 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:35.819094 2576 projected.go:194] Error preparing data for projected volume kube-api-access-pncs2 for pod openshift-network-diagnostics/network-check-target-qbq69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:35.819189 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:35.819148 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2 podName:8837a43b-32fb-45cb-9303-bc2b56966e5f nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:43.819132547 +0000 UTC m=+18.285687411 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pncs2" (UniqueName: "kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2") pod "network-check-target-qbq69" (UID: "8837a43b-32fb-45cb-9303-bc2b56966e5f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:36.176089 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:36.176055 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:36.176247 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:36.176102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:36.176247 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:36.176168 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:36.176541 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:36.176510 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:38.174985 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:38.174932 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:38.175380 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:38.174994 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:38.175380 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:38.175050 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:38.175380 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:38.175170 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:40.175477 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:40.175441 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:40.176011 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:40.175573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:40.176011 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:40.175628 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:40.176011 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:40.175726 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:42.175667 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:42.175626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:42.176122 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:42.175764 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:42.176122 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:42.175833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:42.176122 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:42.175952 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:43.776260 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:43.776222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:43.776703 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:43.776355 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:43.776703 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:43.776434 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs podName:e8425304-94d1-408f-ac22-f5bb6adfce75 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:59.776419153 +0000 UTC m=+34.242974021 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs") pod "network-metrics-daemon-n66hf" (UID: "e8425304-94d1-408f-ac22-f5bb6adfce75") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:43.876704 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:43.876665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:43.876894 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:43.876872 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:43.876943 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:43.876901 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:43.876943 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:43.876915 2576 projected.go:194] Error preparing data for projected volume kube-api-access-pncs2 for pod openshift-network-diagnostics/network-check-target-qbq69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:43.877007 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:43.876978 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2 podName:8837a43b-32fb-45cb-9303-bc2b56966e5f nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:59.876959586 +0000 UTC m=+34.343514452 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-pncs2" (UniqueName: "kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2") pod "network-check-target-qbq69" (UID: "8837a43b-32fb-45cb-9303-bc2b56966e5f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:44.175227 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:44.175189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:44.175453 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:44.175200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:30:44.175453 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:44.175344 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75"
Apr 16 18:30:44.175453 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:44.175412 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f"
Apr 16 18:30:46.175888 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.175799 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:30:46.176527 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.175883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:46.176527 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:46.175944 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f"
Apr 16 18:30:46.176527 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:46.176096 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75"
Apr 16 18:30:46.283076 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.282834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w4gwm" event={"ID":"e3170e08-0669-40fc-b2a2-105f865f2be9","Type":"ContainerStarted","Data":"476870eddeeb86008884f9786107425721d643b716dd5dc70eb7c29beb61ea60"}
Apr 16 18:30:46.284372 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.284341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j85qr" event={"ID":"dffbf089-0f9c-412d-8cef-d3e8343e0951","Type":"ContainerStarted","Data":"18f779c7d52fb39cc35b9a210a278f2cd2a517a4c0b4d40270217a718b17c91a"}
Apr 16 18:30:46.285762 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.285740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" event={"ID":"f0b61b79-c7e3-4bac-b7fc-82e6ba400420","Type":"ContainerStarted","Data":"72568335c0c8c08be2a5d729feb0f44f861c1747054293a5fce5b887b42c75b5"}
Apr 16 18:30:46.289914 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.289895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" event={"ID":"292f88d2-b6d0-4f0b-95e8-aad4c415fc43","Type":"ContainerStarted","Data":"9a7fa485bf53709ea6cf593ac5de88e48c501326abb836b9133133d069ea76c8"}
Apr 16 18:30:46.291850 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.291813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"60e59b0b4b8fa11f8e0e083b4b3aa0bdb30cb4a86ebbbe9f1d2bf5c2b5fa4ece"}
Apr 16 18:30:46.291850 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.291839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"e0fc2350efb573b9958e2609e9804001fd0350b54f78cbf22b26a01ae2b8d3eb"}
Apr 16 18:30:46.291850 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.291849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"d39ad252f9d773aae86a714b43576a9d7131fa5ea8a19c8976d67f0f1258cbbd"}
Apr 16 18:30:46.293234 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.293204 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2a54163-a62f-47da-993d-f3471a740635" containerID="6be5c2530f1a97bbb060081a0cabfa5d81641866e643e7877ca337520a013125" exitCode=0
Apr 16 18:30:46.293331 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.293280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" event={"ID":"f2a54163-a62f-47da-993d-f3471a740635","Type":"ContainerDied","Data":"6be5c2530f1a97bbb060081a0cabfa5d81641866e643e7877ca337520a013125"}
Apr 16 18:30:46.294654 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.294637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9k6mz" event={"ID":"1c74f02e-39bc-4ee2-bd6c-07d23ece32a2","Type":"ContainerStarted","Data":"05d52f4f61e943846a2fedabf9c1857e60049f038559e456cb810424e96e0733"}
Apr 16 18:30:46.295807 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.295790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rs8w8" event={"ID":"129c086c-bc70-4407-a43e-26664dfb816c","Type":"ContainerStarted","Data":"219a665e6fc616136bad68fa9cba8fd6110b3a7345cd2dab991f8ab88cb76aa9"}
Apr 16 18:30:46.306166 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.306126 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w4gwm" podStartSLOduration=3.341553375 podStartE2EDuration="20.306113202s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.824381581 +0000 UTC m=+3.290936464" lastFinishedPulling="2026-04-16 18:30:45.78894141 +0000 UTC m=+20.255496291" observedRunningTime="2026-04-16 18:30:46.304451257 +0000 UTC m=+20.771006143" watchObservedRunningTime="2026-04-16 18:30:46.306113202 +0000 UTC m=+20.772668066"
Apr 16 18:30:46.323794 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.322291 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9k6mz" podStartSLOduration=3.391753957 podStartE2EDuration="20.322224226s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.817664655 +0000 UTC m=+3.284219525" lastFinishedPulling="2026-04-16 18:30:45.748134918 +0000 UTC m=+20.214689794" observedRunningTime="2026-04-16 18:30:46.320926791 +0000 UTC m=+20.787481679" watchObservedRunningTime="2026-04-16 18:30:46.322224226 +0000 UTC m=+20.788779113"
Apr 16 18:30:46.338106 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.338062 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j85qr" podStartSLOduration=3.407510965 podStartE2EDuration="20.338045971s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.817199947 +0000 UTC m=+3.283754815" lastFinishedPulling="2026-04-16 18:30:45.747734952 +0000 UTC m=+20.214289821" observedRunningTime="2026-04-16 18:30:46.337651104 +0000 UTC m=+20.804205991" watchObservedRunningTime="2026-04-16 18:30:46.338045971 +0000 UTC m=+20.804600857"
Apr 16 18:30:46.353120 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.353076 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rs8w8" podStartSLOduration=3.39763221 podStartE2EDuration="20.353061978s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.792659245 +0000 UTC m=+3.259214112" lastFinishedPulling="2026-04-16 18:30:45.748089016 +0000 UTC m=+20.214643880" observedRunningTime="2026-04-16 18:30:46.3526672 +0000 UTC m=+20.819222087" watchObservedRunningTime="2026-04-16 18:30:46.353061978 +0000 UTC m=+20.819616863"
Apr 16 18:30:46.378242 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.378215 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:46.378804 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.378787 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:46.406976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:46.406935 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ljxdq" podStartSLOduration=3.4497944289999998 podStartE2EDuration="20.406917544s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.79298668 +0000 UTC m=+3.259541543" lastFinishedPulling="2026-04-16 18:30:45.750109777 +0000 UTC m=+20.216664658" observedRunningTime="2026-04-16 18:30:46.406677618 +0000 UTC m=+20.873232501" watchObservedRunningTime="2026-04-16 18:30:46.406917544 +0000 UTC m=+20.873472430"
Apr 16 18:30:47.010106 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.010071 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:47.010609 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.010590 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j85qr"
Apr 16 18:30:47.072104 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.072080 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache"
path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:30:47.114560 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.114454 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:30:47.072097666Z","UUID":"6a2b15a1-39d0-45be-a225-9c1981c1dbc4","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:30:47.116136 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.116113 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:30:47.116136 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.116138 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:30:47.301586 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.301532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"4b39166c3081a7f10875ce200ccc85a44774b04b873c8421e044ce845ee61d22"}
Apr 16 18:30:47.301586 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.301572 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"23e1e974fa3c766de864a7c37a850bac1d647b3345fa65428c02135e09449885"}
Apr 16 18:30:47.301586 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.301582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"a9d2875eef4a52aa649c1b2170b90e0526c1f6463290881aa251690a9370c9cb"}
Apr 16 18:30:47.303343 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:47.303314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" event={"ID":"292f88d2-b6d0-4f0b-95e8-aad4c415fc43","Type":"ContainerStarted","Data":"15203d3239154af831df4de7ac5af1dd64dccf596d9d9a164424d0e2d1f443f9"}
Apr 16 18:30:48.175685 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:48.175657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:30:48.175821 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:48.175660 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:48.175821 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:48.175771 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f"
Apr 16 18:30:48.175937 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:48.175854 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75"
Apr 16 18:30:48.307170 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:48.307142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" event={"ID":"292f88d2-b6d0-4f0b-95e8-aad4c415fc43","Type":"ContainerStarted","Data":"44b479c0b17f89c05589a9d40eb1b52da3122275b110ee73470ee242f37f5c1e"}
Apr 16 18:30:48.308980 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:48.308958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-d968n" event={"ID":"4c92cc7a-a8d8-4824-ac8c-b83aca2188a9","Type":"ContainerStarted","Data":"3c62394b2031594e491ec9ac9f3670b0374cbf2079e911be34300e984dee39f8"}
Apr 16 18:30:48.326446 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:48.326372 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-d968n" podStartSLOduration=5.3672647510000004 podStartE2EDuration="22.32635432s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.788749662 +0000 UTC m=+3.255304546" lastFinishedPulling="2026-04-16 18:30:45.747839238 +0000 UTC m=+20.214394115" observedRunningTime="2026-04-16 18:30:48.325695747 +0000 UTC m=+22.792250635" watchObservedRunningTime="2026-04-16 18:30:48.32635432 +0000 UTC m=+22.792909188"
Apr 16 18:30:49.314449 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.314413 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"5f4a0edea16625a3187e9bac8ef359bfc06b4a1287db9723a2af55bfa951dd37"}
Apr 16 18:30:49.743340 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.743226 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6njss" podStartSLOduration=4.407491575 podStartE2EDuration="23.743205539s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.824456998 +0000 UTC m=+3.291011878" lastFinishedPulling="2026-04-16 18:30:48.16017097 +0000 UTC m=+22.626725842" observedRunningTime="2026-04-16 18:30:49.34070835 +0000 UTC m=+23.807263236" watchObservedRunningTime="2026-04-16 18:30:49.743205539 +0000 UTC m=+24.209760407"
Apr 16 18:30:49.743525 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.743437 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ggqxt"]
Apr 16 18:30:49.766014 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.765982 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:49.766173 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:49.766076 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ggqxt" podUID="8617aaa8-5382-49c3-9fbd-7f66b89d8525"
Apr 16 18:30:49.828849 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.828816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8617aaa8-5382-49c3-9fbd-7f66b89d8525-kubelet-config\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:49.829037 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.828898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8617aaa8-5382-49c3-9fbd-7f66b89d8525-dbus\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:49.829037 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.829009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:49.930190 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.930144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8617aaa8-5382-49c3-9fbd-7f66b89d8525-dbus\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:49.930358 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.930207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:49.930358 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.930239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8617aaa8-5382-49c3-9fbd-7f66b89d8525-kubelet-config\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:49.930358 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.930304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8617aaa8-5382-49c3-9fbd-7f66b89d8525-kubelet-config\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:49.930358 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:49.930312 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:49.930574 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:49.930376 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret podName:8617aaa8-5382-49c3-9fbd-7f66b89d8525 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:50.430357363 +0000 UTC m=+24.896912228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret") pod "global-pull-secret-syncer-ggqxt" (UID: "8617aaa8-5382-49c3-9fbd-7f66b89d8525") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:49.930574 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:49.930387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8617aaa8-5382-49c3-9fbd-7f66b89d8525-dbus\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:50.175063 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:50.175012 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:50.175063 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:50.175037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:30:50.175300 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:50.175154 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75"
Apr 16 18:30:50.175300 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:50.175281 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f"
Apr 16 18:30:50.435597 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:50.435505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:50.436010 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:50.435652 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:50.436010 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:50.435716 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret podName:8617aaa8-5382-49c3-9fbd-7f66b89d8525 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:51.435700956 +0000 UTC m=+25.902255820 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret") pod "global-pull-secret-syncer-ggqxt" (UID: "8617aaa8-5382-49c3-9fbd-7f66b89d8525") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:51.176012 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:51.175833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:51.176163 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:51.176079 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ggqxt" podUID="8617aaa8-5382-49c3-9fbd-7f66b89d8525"
Apr 16 18:30:51.319574 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:51.319537 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2a54163-a62f-47da-993d-f3471a740635" containerID="8de50b41f4d5920839b91fd003065c00f8816969dac0d677a649fe562517f979" exitCode=0
Apr 16 18:30:51.319738 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:51.319620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" event={"ID":"f2a54163-a62f-47da-993d-f3471a740635","Type":"ContainerDied","Data":"8de50b41f4d5920839b91fd003065c00f8816969dac0d677a649fe562517f979"}
Apr 16 18:30:51.322889 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:51.322864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" event={"ID":"08cb14f4-383f-4b43-8944-b2fe93cf6dff","Type":"ContainerStarted","Data":"908b3b1af1d2d5b12db559a55126e8b0db923f36ee0caed361c03950b21c6b4f"}
Apr 16 18:30:51.323177 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:51.323163 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:51.338381 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:51.338359 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:51.371458 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:51.371411 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" podStartSLOduration=8.190752741 podStartE2EDuration="25.371376689s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.817248912 +0000 UTC m=+3.283803790" lastFinishedPulling="2026-04-16 18:30:45.997872856 +0000 UTC m=+20.464427738" observedRunningTime="2026-04-16 18:30:51.370167582 +0000 UTC m=+25.836722467" watchObservedRunningTime="2026-04-16 18:30:51.371376689 +0000 UTC m=+25.837931574"
Apr 16 18:30:51.443950 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:51.443837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:51.443950 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:51.443943 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:51.444790 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:51.443998 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret podName:8617aaa8-5382-49c3-9fbd-7f66b89d8525 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:53.443980513 +0000 UTC m=+27.910535390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret") pod "global-pull-secret-syncer-ggqxt" (UID: "8617aaa8-5382-49c3-9fbd-7f66b89d8525") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:52.175691 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:52.175653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:30:52.175830 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:52.175703 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:52.175830 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:52.175787 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75"
Apr 16 18:30:52.175944 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:52.175918 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f"
Apr 16 18:30:52.324832 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:52.324792 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:30:52.325219 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:52.325198 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:52.338853 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:52.338830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:53.049689 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.049488 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qbq69"]
Apr 16 18:30:53.050264 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.049808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:30:53.050264 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:53.049901 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f"
Apr 16 18:30:53.050264 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.050149 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ggqxt"]
Apr 16 18:30:53.050264 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.050251 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:53.050485 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:53.050350 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ggqxt" podUID="8617aaa8-5382-49c3-9fbd-7f66b89d8525"
Apr 16 18:30:53.051579 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.051558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n66hf"]
Apr 16 18:30:53.051674 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.051646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:53.051751 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:53.051731 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75"
Apr 16 18:30:53.328238 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.328211 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2a54163-a62f-47da-993d-f3471a740635" containerID="f96dc89bcb61f308882601bb62772ebd1c7b670c99e54944f3c69a0fdd6bfbaa" exitCode=0
Apr 16 18:30:53.328423 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.328295 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" event={"ID":"f2a54163-a62f-47da-993d-f3471a740635","Type":"ContainerDied","Data":"f96dc89bcb61f308882601bb62772ebd1c7b670c99e54944f3c69a0fdd6bfbaa"}
Apr 16 18:30:53.328535 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.328520 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 18:30:53.459368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.459335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:53.459573 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:53.459549 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:53.459639 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:53.459620 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret podName:8617aaa8-5382-49c3-9fbd-7f66b89d8525 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:57.459598938 +0000 UTC m=+31.926153807 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret") pod "global-pull-secret-syncer-ggqxt" (UID: "8617aaa8-5382-49c3-9fbd-7f66b89d8525") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:53.956496 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:53.956449 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:30:55.175226 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:55.175148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ggqxt"
Apr 16 18:30:55.175827 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:55.175310 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ggqxt" podUID="8617aaa8-5382-49c3-9fbd-7f66b89d8525"
Apr 16 18:30:55.175827 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:55.175794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:30:55.176010 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:55.175988 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:55.176097 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:55.176045 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:55.176194 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:55.176163 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:55.334012 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:55.333983 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2a54163-a62f-47da-993d-f3471a740635" containerID="d5d466eff527122f6f19e94de65864868b0e65395d4c75e281af2718896c32c5" exitCode=0 Apr 16 18:30:55.334141 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:55.334063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" event={"ID":"f2a54163-a62f-47da-993d-f3471a740635","Type":"ContainerDied","Data":"d5d466eff527122f6f19e94de65864868b0e65395d4c75e281af2718896c32c5"} Apr 16 18:30:55.345742 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:55.345701 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw" podUID="08cb14f4-383f-4b43-8944-b2fe93cf6dff" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 18:30:57.175323 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:57.175294 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ggqxt" Apr 16 18:30:57.175851 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:57.175292 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:57.175851 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:57.175426 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ggqxt" podUID="8617aaa8-5382-49c3-9fbd-7f66b89d8525" Apr 16 18:30:57.175851 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:57.175292 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:57.175851 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:57.175483 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:57.175851 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:57.175573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:57.493786 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:57.493696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt" Apr 16 18:30:57.493956 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:57.493844 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:57.493956 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:57.493911 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret podName:8617aaa8-5382-49c3-9fbd-7f66b89d8525 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:05.493896994 +0000 UTC m=+39.960451858 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret") pod "global-pull-secret-syncer-ggqxt" (UID: "8617aaa8-5382-49c3-9fbd-7f66b89d8525") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:59.175511 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.175475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:59.175511 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.175503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ggqxt" Apr 16 18:30:59.175939 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.175482 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:59.175939 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.175578 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qbq69" podUID="8837a43b-32fb-45cb-9303-bc2b56966e5f" Apr 16 18:30:59.175939 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.175658 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n66hf" podUID="e8425304-94d1-408f-ac22-f5bb6adfce75" Apr 16 18:30:59.175939 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.175725 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ggqxt" podUID="8617aaa8-5382-49c3-9fbd-7f66b89d8525" Apr 16 18:30:59.807316 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.807280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:30:59.807537 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.807437 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:59.807537 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.807507 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs podName:e8425304-94d1-408f-ac22-f5bb6adfce75 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.807486847 +0000 UTC m=+66.274041718 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs") pod "network-metrics-daemon-n66hf" (UID: "e8425304-94d1-408f-ac22-f5bb6adfce75") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:59.851460 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.851430 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-47.ec2.internal" event="NodeReady" Apr 16 18:30:59.851634 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.851566 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:30:59.887667 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.887635 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l"] Apr 16 18:30:59.908235 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.907692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:30:59.908235 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.907841 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:59.908235 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.907860 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:59.908235 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.907872 2576 projected.go:194] Error preparing data for projected volume 
kube-api-access-pncs2 for pod openshift-network-diagnostics/network-check-target-qbq69: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:59.908235 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:30:59.907922 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2 podName:8837a43b-32fb-45cb-9303-bc2b56966e5f nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.90790522 +0000 UTC m=+66.374460086 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-pncs2" (UniqueName: "kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2") pod "network-check-target-qbq69" (UID: "8837a43b-32fb-45cb-9303-bc2b56966e5f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:59.913144 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.913123 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55f7bf856d-b6qxj"] Apr 16 18:30:59.913291 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.913273 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l" Apr 16 18:30:59.918853 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.918831 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 18:30:59.919021 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.918987 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:30:59.919354 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.919164 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-npnsn\"" Apr 16 18:30:59.929769 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.929748 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l"] Apr 16 18:30:59.929874 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.929779 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9ms5f"] Apr 16 18:30:59.929914 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.929886 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:30:59.932772 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.932311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:30:59.932772 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.932340 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-p5mxf\"" Apr 16 18:30:59.932934 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.932901 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:30:59.933048 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.933026 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:30:59.948358 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.948330 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:30:59.961267 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.961244 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-64bf8854b4-776ph"] Apr 16 18:30:59.982300 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.982272 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-b7927"] Apr 16 18:30:59.982491 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.982310 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:30:59.982491 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.982324 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9ms5f" Apr 16 18:30:59.991041 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 18:30:59.991228 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991212 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 18:30:59.991317 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991240 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 18:30:59.991317 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991246 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 18:30:59.991544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991529 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:30:59.991634 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991620 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mnpbz\"" Apr 16 18:30:59.991698 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991632 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 18:30:59.991757 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 18:30:59.991882 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:30:59.991868 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hnsm4\"" Apr 16 18:30:59.992891 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:30:59.992870 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:31:00.003016 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.002998 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"] Apr 16 18:31:00.003164 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.003147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" Apr 16 18:31:00.005632 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.005612 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 18:31:00.005726 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.005702 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 18:31:00.006093 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.006072 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-hg8kp\"" Apr 16 18:31:00.006549 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.006525 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:31:00.007595 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.007575 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-image-registry-private-configuration\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-bound-sa-token\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dntmv\" (UniqueName: \"kubernetes.io/projected/f927df6e-69e1-4c13-8409-28c80b811150-kube-api-access-dntmv\") pod \"volume-data-source-validator-7d955d5dd4-7nw2l\" (UID: \"f927df6e-69e1-4c13-8409-28c80b811150\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-installation-pull-secrets\") pod \"image-registry-55f7bf856d-b6qxj\" 
(UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9eafbaff-2bb8-4c09-a410-a5e054fefae3-ca-trust-extracted\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-certificates\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-trusted-ca\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.008384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.008386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdgtn\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-kube-api-access-cdgtn\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.011800 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:31:00.011780 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 18:31:00.017240 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.017219 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"] Apr 16 18:31:00.017420 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.017387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" Apr 16 18:31:00.020474 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.020453 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 18:31:00.020609 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.020457 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 18:31:00.020609 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.020461 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:31:00.020609 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.020541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hwx2d\"" Apr 16 18:31:00.033822 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.033804 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc"] Apr 16 18:31:00.033965 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.033949 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" Apr 16 18:31:00.038141 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.038120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-r4s88\"" Apr 16 18:31:00.038141 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.038130 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 18:31:00.038303 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.038159 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 18:31:00.053945 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.053922 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj"] Apr 16 18:31:00.054052 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.053964 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.056319 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.056295 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 18:31:00.056454 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.056303 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 18:31:00.057245 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.057227 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 18:31:00.057466 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.057424 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-d56wl\"" Apr 16 18:31:00.058340 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.058322 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:31:00.065816 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.065789 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7cbsl"] Apr 16 18:31:00.065978 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.065958 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj"
Apr 16 18:31:00.068821 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.068803 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 18:31:00.068907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.068810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 18:31:00.069338 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.069314 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:31:00.069338 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.069325 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-lq8b2\""
Apr 16 18:31:00.069521 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.069353 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 18:31:00.082367 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.082348 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-g7zhh"]
Apr 16 18:31:00.082521 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.082501 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7cbsl"
Apr 16 18:31:00.085171 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.085151 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:31:00.085408 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.085374 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-44gxs\""
Apr 16 18:31:00.085558 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.085378 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:31:00.085624 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.085606 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:31:00.102930 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.102909 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj"]
Apr 16 18:31:00.103112 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.103094 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh"
Apr 16 18:31:00.105680 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.105657 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-djbgb\""
Apr 16 18:31:00.105896 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.105868 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 18:31:00.106064 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.106041 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:31:00.106224 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.106207 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 18:31:00.106732 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.106696 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:31:00.108832 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.108803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.108936 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.108849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-bound-sa-token\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.108936 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.108886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dntmv\" (UniqueName: \"kubernetes.io/projected/f927df6e-69e1-4c13-8409-28c80b811150-kube-api-access-dntmv\") pod \"volume-data-source-validator-7d955d5dd4-7nw2l\" (UID: \"f927df6e-69e1-4c13-8409-28c80b811150\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l"
Apr 16 18:31:00.108936 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.108926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-default-certificate\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:00.109103 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.108950 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:31:00.109103 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.108966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-installation-pull-secrets\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.109103 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.108974 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7bf856d-b6qxj: secret "image-registry-tls" not found
Apr 16 18:31:00.109103 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.108994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-stats-auth\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:00.109103 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.109027 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls podName:9eafbaff-2bb8-4c09-a410-a5e054fefae3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.609009869 +0000 UTC m=+35.075564747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls") pod "image-registry-55f7bf856d-b6qxj" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3") : secret "image-registry-tls" not found
Apr 16 18:31:00.109309 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dft7g\" (UniqueName: \"kubernetes.io/projected/ec1cac8d-1583-4dd6-b5a5-d40689535353-kube-api-access-dft7g\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"
Apr 16 18:31:00.109309 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqjdw\" (UniqueName: \"kubernetes.io/projected/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-kube-api-access-nqjdw\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:00.109309 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"
Apr 16 18:31:00.109309 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcp9r\" (UniqueName: \"kubernetes.io/projected/765cda1d-eaf6-43b6-a926-4ad4fe965542-kube-api-access-jcp9r\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:00.109309 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9eafbaff-2bb8-4c09-a410-a5e054fefae3-ca-trust-extracted\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-certificates\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcc6daec-498a-4d51-950c-80666fb565da-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-trusted-ca\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cef0db6d-a3ae-4198-8447-b4ee557da9d1-tmp-dir\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdgtn\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-kube-api-access-cdgtn\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/765cda1d-eaf6-43b6-a926-4ad4fe965542-trusted-ca\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:00.109585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"
Apr 16 18:31:00.110003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:00.110003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7stl\" (UniqueName: \"kubernetes.io/projected/cef0db6d-a3ae-4198-8447-b4ee557da9d1-kube-api-access-s7stl\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:00.110003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-image-registry-private-configuration\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.110003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9eafbaff-2bb8-4c09-a410-a5e054fefae3-ca-trust-extracted\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.110003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cef0db6d-a3ae-4198-8447-b4ee557da9d1-config-volume\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:00.110003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/765cda1d-eaf6-43b6-a926-4ad4fe965542-config\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:00.110003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.109758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/765cda1d-eaf6-43b6-a926-4ad4fe965542-serving-cert\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:00.110330 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.110045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-certificates\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.111157 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.111138 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 18:31:00.111249 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.111178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-trusted-ca\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.117588 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.114011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-installation-pull-secrets\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.117588 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.114021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-image-registry-private-configuration\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.119224 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.119201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dntmv\" (UniqueName: \"kubernetes.io/projected/f927df6e-69e1-4c13-8409-28c80b811150-kube-api-access-dntmv\") pod \"volume-data-source-validator-7d955d5dd4-7nw2l\" (UID: \"f927df6e-69e1-4c13-8409-28c80b811150\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l"
Apr 16 18:31:00.119319 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.119279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-bound-sa-token\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.119533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.119447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdgtn\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-kube-api-access-cdgtn\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:00.124156 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.124135 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj"]
Apr 16 18:31:00.124439 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.124415 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj"
Apr 16 18:31:00.126836 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.126796 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:31:00.126836 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.126813 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:31:00.127111 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.127094 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-plw9m\""
Apr 16 18:31:00.142166 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142146 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55f7bf856d-b6qxj"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142174 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-b7927"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142189 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142201 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142215 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-g7zhh"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142229 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142239 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-64bf8854b4-776ph"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142247 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142257 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9ms5f"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142267 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7cbsl"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142277 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj"]
Apr 16 18:31:00.142279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142287 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj"]
Apr 16 18:31:00.142752 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.142311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj"
Apr 16 18:31:00.144422 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.144373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4dtft\""
Apr 16 18:31:00.144904 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.144880 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:31:00.146144 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.146119 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:31:00.146144 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.146135 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 18:31:00.146144 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.146139 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 18:31:00.210123 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcc6daec-498a-4d51-950c-80666fb565da-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210140 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210175 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cef0db6d-a3ae-4198-8447-b4ee557da9d1-tmp-dir\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-config\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.210258 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.210326 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls podName:cef0db6d-a3ae-4198-8447-b4ee557da9d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.710305833 +0000 UTC m=+35.176860713 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls") pod "dns-default-9ms5f" (UID: "cef0db6d-a3ae-4198-8447-b4ee557da9d1") : secret "dns-default-metrics-tls" not found
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7stl\" (UniqueName: \"kubernetes.io/projected/cef0db6d-a3ae-4198-8447-b4ee557da9d1-kube-api-access-s7stl\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/765cda1d-eaf6-43b6-a926-4ad4fe965542-config\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/765cda1d-eaf6-43b6-a926-4ad4fe965542-trusted-ca\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:00.210599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-serving-cert\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnrgf\" (UniqueName: \"kubernetes.io/projected/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-kube-api-access-rnrgf\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-stats-auth\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cef0db6d-a3ae-4198-8447-b4ee557da9d1-tmp-dir\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-tmp\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-snapshots\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rdhk\" (UniqueName: \"kubernetes.io/projected/cee6ee12-c77c-4b90-a41c-75571be006dc-kube-api-access-9rdhk\") pod \"network-check-source-7b678d77c7-jxkbj\" (UID: \"cee6ee12-c77c-4b90-a41c-75571be006dc\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcp9r\" (UniqueName: \"kubernetes.io/projected/765cda1d-eaf6-43b6-a926-4ad4fe965542-kube-api-access-jcp9r\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2j49\" (UniqueName: \"kubernetes.io/projected/592a7c8f-97a7-4307-9682-3926fa559c11-kube-api-access-q2j49\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdxk\" (UniqueName: \"kubernetes.io/projected/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-kube-api-access-vgdxk\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74fc\" (UniqueName: \"kubernetes.io/projected/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-kube-api-access-r74fc\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcc6daec-498a-4d51-950c-80666fb565da-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqlqj\" (UniqueName: \"kubernetes.io/projected/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-kube-api-access-vqlqj\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:00.211148 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.210989 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.210991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh"
Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.211049 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.711030769 +0000 UTC m=+35.177585658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : secret "router-metrics-certs-default" not found
Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.211088 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-serving-cert\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj"
Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/765cda1d-eaf6-43b6-a926-4ad4fe965542-serving-cert\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.211142 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls podName:ec1cac8d-1583-4dd6-b5a5-d40689535353 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.711127225 +0000 UTC m=+35.177682092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls") pod "cluster-samples-operator-667775844f-k7whn" (UID: "ec1cac8d-1583-4dd6-b5a5-d40689535353") : secret "samples-operator-tls" not found Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.211169 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.711159555 +0000 UTC m=+35.177714433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : configmap references non-existent config key: service-ca.crt Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/765cda1d-eaf6-43b6-a926-4ad4fe965542-config\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cef0db6d-a3ae-4198-8447-b4ee557da9d1-config-volume\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqjdw\" (UniqueName: \"kubernetes.io/projected/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-kube-api-access-nqjdw\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-default-certificate\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/592a7c8f-97a7-4307-9682-3926fa559c11-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/765cda1d-eaf6-43b6-a926-4ad4fe965542-trusted-ca\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" Apr 16 18:31:00.211879 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.211662 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:31:00.212685 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.211723 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert podName:fcc6daec-498a-4d51-950c-80666fb565da nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:00.711708717 +0000 UTC m=+35.178263582 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-t9jsh" (UID: "fcc6daec-498a-4d51-950c-80666fb565da") : secret "networking-console-plugin-cert" not found Apr 16 18:31:00.212685 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.212685 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dft7g\" (UniqueName: \"kubernetes.io/projected/ec1cac8d-1583-4dd6-b5a5-d40689535353-kube-api-access-dft7g\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" Apr 16 18:31:00.212685 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.212685 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.211844 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cef0db6d-a3ae-4198-8447-b4ee557da9d1-config-volume\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:00.213702 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.213660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-stats-auth\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:00.214148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.214127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/765cda1d-eaf6-43b6-a926-4ad4fe965542-serving-cert\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" Apr 16 18:31:00.214263 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.214245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-default-certificate\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:00.224280 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.224261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqjdw\" (UniqueName: \"kubernetes.io/projected/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-kube-api-access-nqjdw\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:00.224359 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:31:00.224344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dft7g\" (UniqueName: \"kubernetes.io/projected/ec1cac8d-1583-4dd6-b5a5-d40689535353-kube-api-access-dft7g\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" Apr 16 18:31:00.224474 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.224452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l" Apr 16 18:31:00.224474 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.224475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcp9r\" (UniqueName: \"kubernetes.io/projected/765cda1d-eaf6-43b6-a926-4ad4fe965542-kube-api-access-jcp9r\") pod \"console-operator-d87b8d5fc-b7927\" (UID: \"765cda1d-eaf6-43b6-a926-4ad4fe965542\") " pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" Apr 16 18:31:00.225481 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.225366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7stl\" (UniqueName: \"kubernetes.io/projected/cef0db6d-a3ae-4198-8447-b4ee557da9d1-kube-api-access-s7stl\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:00.312659 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnrgf\" (UniqueName: \"kubernetes.io/projected/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-kube-api-access-rnrgf\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" Apr 16 18:31:00.312659 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:31:00.312605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-tmp\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.312659 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-snapshots\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.312907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rdhk\" (UniqueName: \"kubernetes.io/projected/cee6ee12-c77c-4b90-a41c-75571be006dc-kube-api-access-9rdhk\") pod \"network-check-source-7b678d77c7-jxkbj\" (UID: \"cee6ee12-c77c-4b90-a41c-75571be006dc\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj" Apr 16 18:31:00.312907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2j49\" (UniqueName: \"kubernetes.io/projected/592a7c8f-97a7-4307-9682-3926fa559c11-kube-api-access-q2j49\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:00.312907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdxk\" (UniqueName: \"kubernetes.io/projected/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-kube-api-access-vgdxk\") pod 
\"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.312907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r74fc\" (UniqueName: \"kubernetes.io/projected/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-kube-api-access-r74fc\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.312907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312885 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" Apr 16 18:31:00.312907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqlqj\" (UniqueName: \"kubernetes.io/projected/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-kube-api-access-vqlqj\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:00.313168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.313168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.312960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-serving-cert\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" Apr 16 18:31:00.313168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/592a7c8f-97a7-4307-9682-3926fa559c11-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:00.313168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.313168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.313168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " 
pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-config\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.313422 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-snapshots\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.313437 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.313494 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert podName:2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.813473672 +0000 UTC m=+35.280028554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert") pod "ingress-canary-7cbsl" (UID: "2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f") : secret "canary-serving-cert" not found Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.313707 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls podName:592a7c8f-97a7-4307-9682-3926fa559c11 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:00.813690159 +0000 UTC m=+35.280245029 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jsghj" (UID: "592a7c8f-97a7-4307-9682-3926fa559c11") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-tmp\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313908 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.313981 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.313986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-serving-cert\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.314909 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.314088 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/592a7c8f-97a7-4307-9682-3926fa559c11-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:00.314909 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.314426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-config\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" Apr 16 18:31:00.316065 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.316036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.316163 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.316101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-serving-cert\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" Apr 16 18:31:00.316538 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.316519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-serving-cert\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.322151 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.322113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rdhk\" (UniqueName: \"kubernetes.io/projected/cee6ee12-c77c-4b90-a41c-75571be006dc-kube-api-access-9rdhk\") pod \"network-check-source-7b678d77c7-jxkbj\" (UID: \"cee6ee12-c77c-4b90-a41c-75571be006dc\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj" Apr 16 18:31:00.322308 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.322287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74fc\" (UniqueName: \"kubernetes.io/projected/3fe5dd28-9069-4e1e-9331-ddd24da0b5f2-kube-api-access-r74fc\") pod \"insights-operator-5785d4fcdd-g7zhh\" (UID: \"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2\") " pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.326242 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.326216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdxk\" (UniqueName: \"kubernetes.io/projected/3e48aa88-413f-40b4-bf6a-2dc0acc72e3a-kube-api-access-vgdxk\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vw2xc\" (UID: \"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.326328 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.326266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2j49\" (UniqueName: \"kubernetes.io/projected/592a7c8f-97a7-4307-9682-3926fa559c11-kube-api-access-q2j49\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:00.326791 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.326767 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqlqj\" (UniqueName: \"kubernetes.io/projected/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-kube-api-access-vqlqj\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:00.326882 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.326808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnrgf\" (UniqueName: \"kubernetes.io/projected/ace4cbeb-ecb8-4ffc-b087-db80889cc00f-kube-api-access-rnrgf\") pod \"service-ca-operator-69965bb79d-5qhhj\" (UID: \"ace4cbeb-ecb8-4ffc-b087-db80889cc00f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" Apr 16 18:31:00.365311 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.365266 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" Apr 16 18:31:00.376141 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.376111 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" Apr 16 18:31:00.430140 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.430103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" Apr 16 18:31:00.438360 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.438330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj" Apr 16 18:31:00.616773 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.616502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:00.616924 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.616681 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:31:00.616924 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.616822 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7bf856d-b6qxj: secret "image-registry-tls" not found Apr 16 18:31:00.616924 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.616904 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls podName:9eafbaff-2bb8-4c09-a410-a5e054fefae3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.616881077 +0000 UTC m=+36.083435985 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls") pod "image-registry-55f7bf856d-b6qxj" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3") : secret "image-registry-tls" not found Apr 16 18:31:00.717749 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.717716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:00.717943 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.717761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" Apr 16 18:31:00.717943 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.717788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:00.717943 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.717844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" Apr 16 18:31:00.717943 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.717889 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:31:00.717943 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.717910 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:31:00.718200 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.717963 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.717942961 +0000 UTC m=+36.184497830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : secret "router-metrics-certs-default" not found Apr 16 18:31:00.718200 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.717966 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:00.718200 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.717982 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls podName:ec1cac8d-1583-4dd6-b5a5-d40689535353 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.717971844 +0000 UTC m=+36.184526715 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls") pod "cluster-samples-operator-667775844f-k7whn" (UID: "ec1cac8d-1583-4dd6-b5a5-d40689535353") : secret "samples-operator-tls" not found Apr 16 18:31:00.718200 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.717896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:00.718200 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.717995 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:31:00.718200 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.718021 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.718001123 +0000 UTC m=+36.184556002 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : configmap references non-existent config key: service-ca.crt Apr 16 18:31:00.718200 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.718111 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls podName:cef0db6d-a3ae-4198-8447-b4ee557da9d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.718099775 +0000 UTC m=+36.184654668 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls") pod "dns-default-9ms5f" (UID: "cef0db6d-a3ae-4198-8447-b4ee557da9d1") : secret "dns-default-metrics-tls" not found Apr 16 18:31:00.718200 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.718129 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert podName:fcc6daec-498a-4d51-950c-80666fb565da nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.718119765 +0000 UTC m=+36.184674636 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-t9jsh" (UID: "fcc6daec-498a-4d51-950c-80666fb565da") : secret "networking-console-plugin-cert" not found Apr 16 18:31:00.819366 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.819320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:00.819560 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:00.819387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:00.819560 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.819489 2576 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:00.819560 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.819514 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:00.819560 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.819547 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert podName:2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.819529747 +0000 UTC m=+36.286084624 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert") pod "ingress-canary-7cbsl" (UID: "2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f") : secret "canary-serving-cert" not found Apr 16 18:31:00.819747 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:00.819574 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls podName:592a7c8f-97a7-4307-9682-3926fa559c11 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.819556579 +0000 UTC m=+36.286111449 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jsghj" (UID: "592a7c8f-97a7-4307-9682-3926fa559c11") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:01.147379 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.147346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-g7zhh"] Apr 16 18:31:01.150427 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.150026 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l"] Apr 16 18:31:01.153192 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.153168 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-b7927"] Apr 16 18:31:01.162187 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.162135 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc"] Apr 16 18:31:01.164826 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.164807 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj"] Apr 16 18:31:01.175073 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.175055 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:31:01.175161 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.175055 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:31:01.175237 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.175055 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ggqxt" Apr 16 18:31:01.177795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.177647 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:31:01.177795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.177647 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ds8t7\"" Apr 16 18:31:01.177795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.177686 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:31:01.177795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.177707 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2j6pd\"" Apr 16 18:31:01.182296 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.182275 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj"] Apr 16 18:31:01.234672 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:01.234640 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fe5dd28_9069_4e1e_9331_ddd24da0b5f2.slice/crio-7cf4b3c3c099305f4cfb40067a3b44760f72b14bf7344c40b542285a18d81043 WatchSource:0}: Error finding container 7cf4b3c3c099305f4cfb40067a3b44760f72b14bf7344c40b542285a18d81043: Status 404 returned error can't find the container with id 7cf4b3c3c099305f4cfb40067a3b44760f72b14bf7344c40b542285a18d81043 Apr 16 18:31:01.235599 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:01.234908 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf927df6e_69e1_4c13_8409_28c80b811150.slice/crio-0f114d67ceaf39c74076da4e6a5ccbd2084c59b964b5bf61cc5134ae1c08c95e WatchSource:0}: Error finding container 0f114d67ceaf39c74076da4e6a5ccbd2084c59b964b5bf61cc5134ae1c08c95e: Status 404 returned error can't find the container with id 0f114d67ceaf39c74076da4e6a5ccbd2084c59b964b5bf61cc5134ae1c08c95e Apr 16 18:31:01.235909 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:01.235778 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e48aa88_413f_40b4_bf6a_2dc0acc72e3a.slice/crio-24384736050c3e5cb036bc5193b56b35a57f419d6a8d770d32ab4fb1ac762ddf WatchSource:0}: Error finding container 24384736050c3e5cb036bc5193b56b35a57f419d6a8d770d32ab4fb1ac762ddf: Status 404 returned error can't find the container with id 24384736050c3e5cb036bc5193b56b35a57f419d6a8d770d32ab4fb1ac762ddf Apr 16 18:31:01.236563 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:01.236542 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765cda1d_eaf6_43b6_a926_4ad4fe965542.slice/crio-e5dd4a3176fd3c891f249a1fb0253bb333e2e3f6fdeda79e71279d648e8c7b8f WatchSource:0}: Error finding container e5dd4a3176fd3c891f249a1fb0253bb333e2e3f6fdeda79e71279d648e8c7b8f: Status 404 returned error can't find the container with id e5dd4a3176fd3c891f249a1fb0253bb333e2e3f6fdeda79e71279d648e8c7b8f Apr 16 18:31:01.238678 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:01.237743 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podace4cbeb_ecb8_4ffc_b087_db80889cc00f.slice/crio-4eb0698c45083d1caa463523332728add464c8622052bd5ab45ec8d13835a39f WatchSource:0}: Error finding container 4eb0698c45083d1caa463523332728add464c8622052bd5ab45ec8d13835a39f: Status 404 returned error can't find the 
container with id 4eb0698c45083d1caa463523332728add464c8622052bd5ab45ec8d13835a39f Apr 16 18:31:01.238678 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:01.238579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee6ee12_c77c_4b90_a41c_75571be006dc.slice/crio-6c6c3cb281db7b1757192b007dd2b4ed3d3b3092ad4bfcf34723fda441c78805 WatchSource:0}: Error finding container 6c6c3cb281db7b1757192b007dd2b4ed3d3b3092ad4bfcf34723fda441c78805: Status 404 returned error can't find the container with id 6c6c3cb281db7b1757192b007dd2b4ed3d3b3092ad4bfcf34723fda441c78805 Apr 16 18:31:01.346537 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.346508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" event={"ID":"ace4cbeb-ecb8-4ffc-b087-db80889cc00f","Type":"ContainerStarted","Data":"4eb0698c45083d1caa463523332728add464c8622052bd5ab45ec8d13835a39f"} Apr 16 18:31:01.347972 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.347932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l" event={"ID":"f927df6e-69e1-4c13-8409-28c80b811150","Type":"ContainerStarted","Data":"0f114d67ceaf39c74076da4e6a5ccbd2084c59b964b5bf61cc5134ae1c08c95e"} Apr 16 18:31:01.349084 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.349059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj" event={"ID":"cee6ee12-c77c-4b90-a41c-75571be006dc","Type":"ContainerStarted","Data":"6c6c3cb281db7b1757192b007dd2b4ed3d3b3092ad4bfcf34723fda441c78805"} Apr 16 18:31:01.350028 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.350006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" 
event={"ID":"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2","Type":"ContainerStarted","Data":"7cf4b3c3c099305f4cfb40067a3b44760f72b14bf7344c40b542285a18d81043"} Apr 16 18:31:01.351044 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.351024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" event={"ID":"765cda1d-eaf6-43b6-a926-4ad4fe965542","Type":"ContainerStarted","Data":"e5dd4a3176fd3c891f249a1fb0253bb333e2e3f6fdeda79e71279d648e8c7b8f"} Apr 16 18:31:01.352193 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.352172 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" event={"ID":"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a","Type":"ContainerStarted","Data":"24384736050c3e5cb036bc5193b56b35a57f419d6a8d770d32ab4fb1ac762ddf"} Apr 16 18:31:01.630238 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.630069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:01.630361 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.630210 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:31:01.630361 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.630314 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7bf856d-b6qxj: secret "image-registry-tls" not found Apr 16 18:31:01.630455 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.630366 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls podName:9eafbaff-2bb8-4c09-a410-a5e054fefae3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:03.630350461 +0000 UTC m=+38.096905325 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls") pod "image-registry-55f7bf856d-b6qxj" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3") : secret "image-registry-tls" not found Apr 16 18:31:01.731223 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.731129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" Apr 16 18:31:01.731223 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.731212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731274 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.731333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " 
pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731349 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert podName:fcc6daec-498a-4d51-950c-80666fb565da nodeName:}" failed. No retries permitted until 2026-04-16 18:31:03.731327063 +0000 UTC m=+38.197881946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-t9jsh" (UID: "fcc6daec-498a-4d51-950c-80666fb565da") : secret "networking-console-plugin-cert" not found Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731377 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.731417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731425 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731447 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls podName:cef0db6d-a3ae-4198-8447-b4ee557da9d1 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:03.731432217 +0000 UTC m=+38.197987095 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls") pod "dns-default-9ms5f" (UID: "cef0db6d-a3ae-4198-8447-b4ee557da9d1") : secret "dns-default-metrics-tls" not found Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731475 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:03.731460965 +0000 UTC m=+38.198015836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : secret "router-metrics-certs-default" not found Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731480 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.731502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:01.731508 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731516 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls podName:ec1cac8d-1583-4dd6-b5a5-d40689535353 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:03.731504537 +0000 UTC m=+38.198059400 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls") pod "cluster-samples-operator-667775844f-k7whn" (UID: "ec1cac8d-1583-4dd6-b5a5-d40689535353") : secret "samples-operator-tls" not found Apr 16 18:31:01.731956 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.731573 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:03.731562888 +0000 UTC m=+38.198117754 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : configmap references non-existent config key: service-ca.crt Apr 16 18:31:01.832530 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.832497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:01.832697 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:01.832561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:01.832697 
ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.832672 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:01.832697 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.832689 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:01.832824 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.832743 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls podName:592a7c8f-97a7-4307-9682-3926fa559c11 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:03.832727685 +0000 UTC m=+38.299282548 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jsghj" (UID: "592a7c8f-97a7-4307-9682-3926fa559c11") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:01.832824 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:01.832756 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert podName:2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f nodeName:}" failed. No retries permitted until 2026-04-16 18:31:03.832750388 +0000 UTC m=+38.299305251 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert") pod "ingress-canary-7cbsl" (UID: "2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f") : secret "canary-serving-cert" not found Apr 16 18:31:02.363836 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:02.363801 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2a54163-a62f-47da-993d-f3471a740635" containerID="1ea301cebbb954347d9b1d70620580a01b6d7affd56e60b9d65596de5d5c3072" exitCode=0 Apr 16 18:31:02.364410 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:02.363885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" event={"ID":"f2a54163-a62f-47da-993d-f3471a740635","Type":"ContainerDied","Data":"1ea301cebbb954347d9b1d70620580a01b6d7affd56e60b9d65596de5d5c3072"} Apr 16 18:31:03.380520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.380469 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2a54163-a62f-47da-993d-f3471a740635" containerID="1426fceb4e49fb0912d0edbb87b19a05be4b17a714ebabbb38f4650ddb5de63e" exitCode=0 Apr 16 18:31:03.380999 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.380538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" event={"ID":"f2a54163-a62f-47da-993d-f3471a740635","Type":"ContainerDied","Data":"1426fceb4e49fb0912d0edbb87b19a05be4b17a714ebabbb38f4650ddb5de63e"} Apr 16 18:31:03.651357 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.651264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:03.651551 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.651428 2576 
projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:31:03.651551 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.651447 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7bf856d-b6qxj: secret "image-registry-tls" not found Apr 16 18:31:03.651551 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.651512 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls podName:9eafbaff-2bb8-4c09-a410-a5e054fefae3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.6514899 +0000 UTC m=+42.118044781 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls") pod "image-registry-55f7bf856d-b6qxj" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3") : secret "image-registry-tls" not found Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.752863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.752922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:31:03.752957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.753009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753037 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753065 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.753070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753109 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:07.753089765 +0000 UTC m=+42.219644643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : secret "router-metrics-certs-default" not found Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753133 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.753123583 +0000 UTC m=+42.219678457 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : configmap references non-existent config key: service-ca.crt Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753152 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753162 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls podName:ec1cac8d-1583-4dd6-b5a5-d40689535353 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.753153866 +0000 UTC m=+42.219708747 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls") pod "cluster-samples-operator-667775844f-k7whn" (UID: "ec1cac8d-1583-4dd6-b5a5-d40689535353") : secret "samples-operator-tls" not found Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753170 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753186 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls podName:cef0db6d-a3ae-4198-8447-b4ee557da9d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.753175809 +0000 UTC m=+42.219730674 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls") pod "dns-default-9ms5f" (UID: "cef0db6d-a3ae-4198-8447-b4ee557da9d1") : secret "dns-default-metrics-tls" not found Apr 16 18:31:03.753292 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.753201 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert podName:fcc6daec-498a-4d51-950c-80666fb565da nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.753194382 +0000 UTC m=+42.219749245 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-t9jsh" (UID: "fcc6daec-498a-4d51-950c-80666fb565da") : secret "networking-console-plugin-cert" not found Apr 16 18:31:03.854699 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.854657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:03.854863 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.854836 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:03.854909 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:03.854878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:03.854909 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.854904 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls podName:592a7c8f-97a7-4307-9682-3926fa559c11 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.854888684 +0000 UTC m=+42.321443552 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jsghj" (UID: "592a7c8f-97a7-4307-9682-3926fa559c11") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:03.854977 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.854954 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:03.855013 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:03.854991 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert podName:2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f nodeName:}" failed. No retries permitted until 2026-04-16 18:31:07.854980867 +0000 UTC m=+42.321535732 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert") pod "ingress-canary-7cbsl" (UID: "2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f") : secret "canary-serving-cert" not found Apr 16 18:31:05.571635 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:05.571602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " pod="kube-system/global-pull-secret-syncer-ggqxt" Apr 16 18:31:05.575084 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:05.575063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8617aaa8-5382-49c3-9fbd-7f66b89d8525-original-pull-secret\") pod \"global-pull-secret-syncer-ggqxt\" (UID: \"8617aaa8-5382-49c3-9fbd-7f66b89d8525\") " 
pod="kube-system/global-pull-secret-syncer-ggqxt" Apr 16 18:31:05.696515 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:05.696478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ggqxt" Apr 16 18:31:06.814740 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:06.814693 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ggqxt"] Apr 16 18:31:06.821951 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:06.821909 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8617aaa8_5382_49c3_9fbd_7f66b89d8525.slice/crio-3249a18cdafd397c847d2976dca83f65232ca3a6e98279322e0e5f6deb734b3a WatchSource:0}: Error finding container 3249a18cdafd397c847d2976dca83f65232ca3a6e98279322e0e5f6deb734b3a: Status 404 returned error can't find the container with id 3249a18cdafd397c847d2976dca83f65232ca3a6e98279322e0e5f6deb734b3a Apr 16 18:31:07.391151 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.390523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj" event={"ID":"cee6ee12-c77c-4b90-a41c-75571be006dc","Type":"ContainerStarted","Data":"3b6dedc4aed955f65e60a0bc3a676a0e04ce2a7a00285dde3114fa16f5859c2b"} Apr 16 18:31:07.392804 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.392775 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" event={"ID":"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2","Type":"ContainerStarted","Data":"695440e7264caad758cd23182306bd18762f6bb4131b18f819913594d9ca3831"} Apr 16 18:31:07.394849 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.394827 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/0.log" Apr 16 18:31:07.394931 
ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.394904 2576 generic.go:358] "Generic (PLEG): container finished" podID="765cda1d-eaf6-43b6-a926-4ad4fe965542" containerID="9487070d697e31689f94f9d8db21cdab6ee8079f10200105c015b312dc4b90b7" exitCode=255 Apr 16 18:31:07.394990 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.394965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" event={"ID":"765cda1d-eaf6-43b6-a926-4ad4fe965542","Type":"ContainerDied","Data":"9487070d697e31689f94f9d8db21cdab6ee8079f10200105c015b312dc4b90b7"} Apr 16 18:31:07.395191 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.395170 2576 scope.go:117] "RemoveContainer" containerID="9487070d697e31689f94f9d8db21cdab6ee8079f10200105c015b312dc4b90b7" Apr 16 18:31:07.400602 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.400560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" event={"ID":"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a","Type":"ContainerStarted","Data":"a30b2bfc4ef91bfcd3de9ca09020f17589d6992aa34add619d6885fd81e47657"} Apr 16 18:31:07.402264 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.402243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ggqxt" event={"ID":"8617aaa8-5382-49c3-9fbd-7f66b89d8525","Type":"ContainerStarted","Data":"3249a18cdafd397c847d2976dca83f65232ca3a6e98279322e0e5f6deb734b3a"} Apr 16 18:31:07.407573 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.407526 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-jxkbj" podStartSLOduration=31.033326241 podStartE2EDuration="36.40751029s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:01.268540461 +0000 UTC m=+35.735095325" lastFinishedPulling="2026-04-16 
18:31:06.642724508 +0000 UTC m=+41.109279374" observedRunningTime="2026-04-16 18:31:07.405846612 +0000 UTC m=+41.872401498" watchObservedRunningTime="2026-04-16 18:31:07.40751029 +0000 UTC m=+41.874065177" Apr 16 18:31:07.409148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.408965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" event={"ID":"f2a54163-a62f-47da-993d-f3471a740635","Type":"ContainerStarted","Data":"ae4c5417739b66024f013a743e4a816d505870739529dcf3a88ba4b316d1d7fc"} Apr 16 18:31:07.411283 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.411175 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" event={"ID":"ace4cbeb-ecb8-4ffc-b087-db80889cc00f","Type":"ContainerStarted","Data":"0232b20447cc55bc4f3b2c5d25ec7a6aeb83e1174deb5991b254a94f5e73422e"} Apr 16 18:31:07.416426 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.414987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l" event={"ID":"f927df6e-69e1-4c13-8409-28c80b811150","Type":"ContainerStarted","Data":"8de2b05ee5cf8b40c90b3c6ace2d071e4b9c790737168ea15a4cc1bb74faf152"} Apr 16 18:31:07.445019 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.444748 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" podStartSLOduration=31.045311608 podStartE2EDuration="36.444730265s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:01.236833478 +0000 UTC m=+35.703388346" lastFinishedPulling="2026-04-16 18:31:06.636252134 +0000 UTC m=+41.102807003" observedRunningTime="2026-04-16 18:31:07.44273654 +0000 UTC m=+41.909291427" watchObservedRunningTime="2026-04-16 18:31:07.444730265 +0000 UTC m=+41.911285152" Apr 16 18:31:07.466983 ip-10-0-137-47 kubenswrapper[2576]: 
I0416 18:31:07.465293 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" podStartSLOduration=31.067359847 podStartE2EDuration="36.465276353s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:01.238029173 +0000 UTC m=+35.704584038" lastFinishedPulling="2026-04-16 18:31:06.635945672 +0000 UTC m=+41.102500544" observedRunningTime="2026-04-16 18:31:07.464808257 +0000 UTC m=+41.931363143" watchObservedRunningTime="2026-04-16 18:31:07.465276353 +0000 UTC m=+41.931831241" Apr 16 18:31:07.492066 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.491769 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ll6hq" podStartSLOduration=9.019471862 podStartE2EDuration="41.491749201s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:30:28.81724771 +0000 UTC m=+3.283802575" lastFinishedPulling="2026-04-16 18:31:01.28952505 +0000 UTC m=+35.756079914" observedRunningTime="2026-04-16 18:31:07.489974266 +0000 UTC m=+41.956529154" watchObservedRunningTime="2026-04-16 18:31:07.491749201 +0000 UTC m=+41.958304087" Apr 16 18:31:07.509565 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.509496 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-7nw2l" podStartSLOduration=31.107983272 podStartE2EDuration="36.509478683s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:01.237283324 +0000 UTC m=+35.703838203" lastFinishedPulling="2026-04-16 18:31:06.638778735 +0000 UTC m=+41.105333614" observedRunningTime="2026-04-16 18:31:07.508571327 +0000 UTC m=+41.975126225" watchObservedRunningTime="2026-04-16 18:31:07.509478683 +0000 UTC m=+41.976033575" Apr 16 18:31:07.530671 
ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.530166 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" podStartSLOduration=31.162837649 podStartE2EDuration="36.530149552s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:01.268696311 +0000 UTC m=+35.735251180" lastFinishedPulling="2026-04-16 18:31:06.636008203 +0000 UTC m=+41.102563083" observedRunningTime="2026-04-16 18:31:07.529059697 +0000 UTC m=+41.995614593" watchObservedRunningTime="2026-04-16 18:31:07.530149552 +0000 UTC m=+41.996704439" Apr 16 18:31:07.700216 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.699471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:07.700216 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.699708 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:31:07.700216 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.699727 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7bf856d-b6qxj: secret "image-registry-tls" not found Apr 16 18:31:07.700216 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.699791 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls podName:9eafbaff-2bb8-4c09-a410-a5e054fefae3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:15.699770938 +0000 UTC m=+50.166325826 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls") pod "image-registry-55f7bf856d-b6qxj" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3") : secret "image-registry-tls" not found Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.800690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.800801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.800835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.800857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:07.801425 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:31:07.800886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.800991 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.801037 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert podName:fcc6daec-498a-4d51-950c-80666fb565da nodeName:}" failed. No retries permitted until 2026-04-16 18:31:15.801023419 +0000 UTC m=+50.267578283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-t9jsh" (UID: "fcc6daec-498a-4d51-950c-80666fb565da") : secret "networking-console-plugin-cert" not found Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.801278 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:15.801254854 +0000 UTC m=+50.267809720 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : configmap references non-existent config key: service-ca.crt Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.801302 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.801365 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.801369 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls podName:ec1cac8d-1583-4dd6-b5a5-d40689535353 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:15.801352221 +0000 UTC m=+50.267907088 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls") pod "cluster-samples-operator-667775844f-k7whn" (UID: "ec1cac8d-1583-4dd6-b5a5-d40689535353") : secret "samples-operator-tls" not found Apr 16 18:31:07.801425 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.801421 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls podName:cef0db6d-a3ae-4198-8447-b4ee557da9d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:15.801389864 +0000 UTC m=+50.267944731 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls") pod "dns-default-9ms5f" (UID: "cef0db6d-a3ae-4198-8447-b4ee557da9d1") : secret "dns-default-metrics-tls" not found Apr 16 18:31:07.801900 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.801450 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:31:07.801900 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.801494 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:15.801483213 +0000 UTC m=+50.268038077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : secret "router-metrics-certs-default" not found Apr 16 18:31:07.902000 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.901960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:07.902457 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:07.902090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " 
pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:07.902457 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.902183 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:07.902457 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.902231 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert podName:2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f nodeName:}" failed. No retries permitted until 2026-04-16 18:31:15.902213684 +0000 UTC m=+50.368768551 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert") pod "ingress-canary-7cbsl" (UID: "2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f") : secret "canary-serving-cert" not found Apr 16 18:31:07.902654 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.902558 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:31:07.902654 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:07.902596 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls podName:592a7c8f-97a7-4307-9682-3926fa559c11 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:15.902586167 +0000 UTC m=+50.369141031 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jsghj" (UID: "592a7c8f-97a7-4307-9682-3926fa559c11") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:31:08.420527 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.420499 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:31:08.421148 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.421119 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/0.log"
Apr 16 18:31:08.421258 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.421162 2576 generic.go:358] "Generic (PLEG): container finished" podID="765cda1d-eaf6-43b6-a926-4ad4fe965542" containerID="755c3fce3d17e1d4ef55a1e04783b3e4cfee7ce0e98c344c48dc84a441ab6ef8" exitCode=255
Apr 16 18:31:08.422059 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.422037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" event={"ID":"765cda1d-eaf6-43b6-a926-4ad4fe965542","Type":"ContainerDied","Data":"755c3fce3d17e1d4ef55a1e04783b3e4cfee7ce0e98c344c48dc84a441ab6ef8"}
Apr 16 18:31:08.422200 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.422079 2576 scope.go:117] "RemoveContainer" containerID="9487070d697e31689f94f9d8db21cdab6ee8079f10200105c015b312dc4b90b7"
Apr 16 18:31:08.422657 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.422623 2576 scope.go:117] "RemoveContainer" containerID="755c3fce3d17e1d4ef55a1e04783b3e4cfee7ce0e98c344c48dc84a441ab6ef8"
Apr 16 18:31:08.422836 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:08.422816 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-b7927_openshift-console-operator(765cda1d-eaf6-43b6-a926-4ad4fe965542)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" podUID="765cda1d-eaf6-43b6-a926-4ad4fe965542"
Apr 16 18:31:08.775236 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.775152 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm"]
Apr 16 18:31:08.779838 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.779822 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm"
Apr 16 18:31:08.789477 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.789458 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 18:31:08.789625 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.789589 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 18:31:08.789714 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.789698 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-8jg5t\""
Apr 16 18:31:08.797965 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.797943 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm"]
Apr 16 18:31:08.912363 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:08.912333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tsk\" (UniqueName: \"kubernetes.io/projected/1d6751a4-0f7f-4439-844e-5585a26c5f43-kube-api-access-g2tsk\") pod \"migrator-64d4d94569-bmcbm\" (UID: \"1d6751a4-0f7f-4439-844e-5585a26c5f43\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm"
Apr 16 18:31:09.013633 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:09.013590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tsk\" (UniqueName: \"kubernetes.io/projected/1d6751a4-0f7f-4439-844e-5585a26c5f43-kube-api-access-g2tsk\") pod \"migrator-64d4d94569-bmcbm\" (UID: \"1d6751a4-0f7f-4439-844e-5585a26c5f43\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm"
Apr 16 18:31:09.023206 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:09.023177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tsk\" (UniqueName: \"kubernetes.io/projected/1d6751a4-0f7f-4439-844e-5585a26c5f43-kube-api-access-g2tsk\") pod \"migrator-64d4d94569-bmcbm\" (UID: \"1d6751a4-0f7f-4439-844e-5585a26c5f43\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm"
Apr 16 18:31:09.089356 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:09.089327 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm"
Apr 16 18:31:09.222968 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:09.222932 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm"]
Apr 16 18:31:09.227815 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:09.227784 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6751a4_0f7f_4439_844e_5585a26c5f43.slice/crio-bff99547e3f5c302ace644927840fb21f7085eee41c1bd0531dd06dd2235adb4 WatchSource:0}: Error finding container bff99547e3f5c302ace644927840fb21f7085eee41c1bd0531dd06dd2235adb4: Status 404 returned error can't find the container with id bff99547e3f5c302ace644927840fb21f7085eee41c1bd0531dd06dd2235adb4
Apr 16 18:31:09.426023 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:09.425935 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:31:09.426369 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:09.426332 2576 scope.go:117] "RemoveContainer" containerID="755c3fce3d17e1d4ef55a1e04783b3e4cfee7ce0e98c344c48dc84a441ab6ef8"
Apr 16 18:31:09.426627 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:09.426598 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-b7927_openshift-console-operator(765cda1d-eaf6-43b6-a926-4ad4fe965542)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" podUID="765cda1d-eaf6-43b6-a926-4ad4fe965542"
Apr 16 18:31:09.427261 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:09.427241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm" event={"ID":"1d6751a4-0f7f-4439-844e-5585a26c5f43","Type":"ContainerStarted","Data":"bff99547e3f5c302ace644927840fb21f7085eee41c1bd0531dd06dd2235adb4"}
Apr 16 18:31:09.600810 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:09.600776 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rs8w8_129c086c-bc70-4407-a43e-26664dfb816c/dns-node-resolver/0.log"
Apr 16 18:31:10.313254 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.313216 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:10.313727 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.313382 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:10.400198 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.400161 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9k6mz_1c74f02e-39bc-4ee2-bd6c-07d23ece32a2/node-ca/0.log"
Apr 16 18:31:10.430817 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.430787 2576 scope.go:117] "RemoveContainer" containerID="755c3fce3d17e1d4ef55a1e04783b3e4cfee7ce0e98c344c48dc84a441ab6ef8"
Apr 16 18:31:10.431000 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:10.430973 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-b7927_openshift-console-operator(765cda1d-eaf6-43b6-a926-4ad4fe965542)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" podUID="765cda1d-eaf6-43b6-a926-4ad4fe965542"
Apr 16 18:31:10.760511 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.760425 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-27x7n"]
Apr 16 18:31:10.765193 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.765165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.767616 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.767592 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 18:31:10.767750 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.767633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 18:31:10.768600 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.768557 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 18:31:10.768723 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.768599 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 18:31:10.768723 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.768669 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qgjtn\""
Apr 16 18:31:10.774819 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.774796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-27x7n"]
Apr 16 18:31:10.833160 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.833125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0390c408-0354-460f-b4d9-03a1e5e0274e-signing-key\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.833299 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.833193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0390c408-0354-460f-b4d9-03a1e5e0274e-signing-cabundle\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.833384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.833370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79ljl\" (UniqueName: \"kubernetes.io/projected/0390c408-0354-460f-b4d9-03a1e5e0274e-kube-api-access-79ljl\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.934720 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.934695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79ljl\" (UniqueName: \"kubernetes.io/projected/0390c408-0354-460f-b4d9-03a1e5e0274e-kube-api-access-79ljl\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.934841 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.934741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0390c408-0354-460f-b4d9-03a1e5e0274e-signing-key\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.934952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.934910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0390c408-0354-460f-b4d9-03a1e5e0274e-signing-cabundle\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.935690 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.935668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0390c408-0354-460f-b4d9-03a1e5e0274e-signing-cabundle\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.937220 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.937197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0390c408-0354-460f-b4d9-03a1e5e0274e-signing-key\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:10.944629 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:10.944592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79ljl\" (UniqueName: \"kubernetes.io/projected/0390c408-0354-460f-b4d9-03a1e5e0274e-kube-api-access-79ljl\") pod \"service-ca-bfc587fb7-27x7n\" (UID: \"0390c408-0354-460f-b4d9-03a1e5e0274e\") " pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:11.076449 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:11.076386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-27x7n"
Apr 16 18:31:11.214266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:11.214233 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-27x7n"]
Apr 16 18:31:11.218572 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:11.218543 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0390c408_0354_460f_b4d9_03a1e5e0274e.slice/crio-eca3274937309b43e8b5dbdac038dab04d6228f8c5b008693bedac44e555c429 WatchSource:0}: Error finding container eca3274937309b43e8b5dbdac038dab04d6228f8c5b008693bedac44e555c429: Status 404 returned error can't find the container with id eca3274937309b43e8b5dbdac038dab04d6228f8c5b008693bedac44e555c429
Apr 16 18:31:11.435810 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:11.435728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ggqxt" event={"ID":"8617aaa8-5382-49c3-9fbd-7f66b89d8525","Type":"ContainerStarted","Data":"0347dc64f8fddf38c8b50f77e5d97389c6fc725db258febd55629c193e31b1ec"}
Apr 16 18:31:11.437797 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:11.437770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-27x7n" event={"ID":"0390c408-0354-460f-b4d9-03a1e5e0274e","Type":"ContainerStarted","Data":"9acdfb9de61c8262d941afb775b9eaca03d43a072943c27fc65fe270455223e9"}
Apr 16 18:31:11.437921 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:11.437812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-27x7n" event={"ID":"0390c408-0354-460f-b4d9-03a1e5e0274e","Type":"ContainerStarted","Data":"eca3274937309b43e8b5dbdac038dab04d6228f8c5b008693bedac44e555c429"}
Apr 16 18:31:11.438560 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:11.438064 2576 scope.go:117] "RemoveContainer" containerID="755c3fce3d17e1d4ef55a1e04783b3e4cfee7ce0e98c344c48dc84a441ab6ef8"
Apr 16 18:31:11.438560 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:11.438232 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-b7927_openshift-console-operator(765cda1d-eaf6-43b6-a926-4ad4fe965542)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" podUID="765cda1d-eaf6-43b6-a926-4ad4fe965542"
Apr 16 18:31:11.452768 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:11.452719 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ggqxt" podStartSLOduration=18.43997527 podStartE2EDuration="22.452702134s" podCreationTimestamp="2026-04-16 18:30:49 +0000 UTC" firstStartedPulling="2026-04-16 18:31:06.824508733 +0000 UTC m=+41.291063618" lastFinishedPulling="2026-04-16 18:31:10.837235616 +0000 UTC m=+45.303790482" observedRunningTime="2026-04-16 18:31:11.451809969 +0000 UTC m=+45.918364855" watchObservedRunningTime="2026-04-16 18:31:11.452702134 +0000 UTC m=+45.919257026"
Apr 16 18:31:11.469815 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:11.469773 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-27x7n" podStartSLOduration=1.469756974 podStartE2EDuration="1.469756974s" podCreationTimestamp="2026-04-16 18:31:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:11.469370782 +0000 UTC m=+45.935925684" watchObservedRunningTime="2026-04-16 18:31:11.469756974 +0000 UTC m=+45.936311860"
Apr 16 18:31:12.442102 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:12.442064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm" event={"ID":"1d6751a4-0f7f-4439-844e-5585a26c5f43","Type":"ContainerStarted","Data":"228f182ce42a5627c807e10a9d569fbf5bc0c29d77622b0bf885be5e73840621"}
Apr 16 18:31:12.442584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:12.442109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm" event={"ID":"1d6751a4-0f7f-4439-844e-5585a26c5f43","Type":"ContainerStarted","Data":"ccebd4c9c843eefbbe8d14c50bbad943978874c0b7f0c6ce4cf536f63fb2052a"}
Apr 16 18:31:12.463067 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:12.463015 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bmcbm" podStartSLOduration=2.048738985 podStartE2EDuration="4.463002132s" podCreationTimestamp="2026-04-16 18:31:08 +0000 UTC" firstStartedPulling="2026-04-16 18:31:09.230109793 +0000 UTC m=+43.696664674" lastFinishedPulling="2026-04-16 18:31:11.644372941 +0000 UTC m=+46.110927821" observedRunningTime="2026-04-16 18:31:12.461931798 +0000 UTC m=+46.928486683" watchObservedRunningTime="2026-04-16 18:31:12.463002132 +0000 UTC m=+46.929557018"
Apr 16 18:31:15.782073 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.782039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:15.782532 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.782197 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:31:15.782532 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.782215 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-55f7bf856d-b6qxj: secret "image-registry-tls" not found
Apr 16 18:31:15.782532 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.782275 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls podName:9eafbaff-2bb8-4c09-a410-a5e054fefae3 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.782257223 +0000 UTC m=+66.248812103 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls") pod "image-registry-55f7bf856d-b6qxj" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3") : secret "image-registry-tls" not found
Apr 16 18:31:15.883125 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.883082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:15.883314 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.883203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:15.883314 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.883228 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:31:15.883314 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.883298 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls podName:cef0db6d-a3ae-4198-8447-b4ee557da9d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.883282888 +0000 UTC m=+66.349837752 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls") pod "dns-default-9ms5f" (UID: "cef0db6d-a3ae-4198-8447-b4ee557da9d1") : secret "dns-default-metrics-tls" not found
Apr 16 18:31:15.883523 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.883324 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:31:15.883523 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.883233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"
Apr 16 18:31:15.883523 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.883381 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.883363915 +0000 UTC m=+66.349918799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : secret "router-metrics-certs-default" not found
Apr 16 18:31:15.883523 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.883419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:15.883523 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.883472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"
Apr 16 18:31:15.883737 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.883558 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 18:31:15.883737 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.883578 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle podName:3e312f71-4f6a-4206-99c4-62f2f2ab84ef nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.883561585 +0000 UTC m=+66.350116449 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle") pod "router-default-64bf8854b4-776ph" (UID: "3e312f71-4f6a-4206-99c4-62f2f2ab84ef") : configmap references non-existent config key: service-ca.crt
Apr 16 18:31:15.883737 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.883600 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert podName:fcc6daec-498a-4d51-950c-80666fb565da nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.883589028 +0000 UTC m=+66.350143893 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-t9jsh" (UID: "fcc6daec-498a-4d51-950c-80666fb565da") : secret "networking-console-plugin-cert" not found
Apr 16 18:31:15.885891 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.885868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec1cac8d-1583-4dd6-b5a5-d40689535353-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-k7whn\" (UID: \"ec1cac8d-1583-4dd6-b5a5-d40689535353\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"
Apr 16 18:31:15.927579 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.927552 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"
Apr 16 18:31:15.984390 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.984359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl"
Apr 16 18:31:15.984390 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:15.984433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj"
Apr 16 18:31:15.984390 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.984569 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:31:15.984852 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.984630 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls podName:592a7c8f-97a7-4307-9682-3926fa559c11 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.984603958 +0000 UTC m=+66.451158841 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jsghj" (UID: "592a7c8f-97a7-4307-9682-3926fa559c11") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:31:15.984852 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.984667 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:31:15.984852 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:31:15.984705 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert podName:2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f nodeName:}" failed. No retries permitted until 2026-04-16 18:31:31.984693788 +0000 UTC m=+66.451248653 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert") pod "ingress-canary-7cbsl" (UID: "2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f") : secret "canary-serving-cert" not found
Apr 16 18:31:16.076004 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:16.075974 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn"]
Apr 16 18:31:16.455268 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:16.455191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" event={"ID":"ec1cac8d-1583-4dd6-b5a5-d40689535353","Type":"ContainerStarted","Data":"21365bb69bc34491f1a42c23b15d89de3a7263caa9a615efe4c3a5bad93f23d5"}
Apr 16 18:31:19.464713 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:19.464678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" event={"ID":"ec1cac8d-1583-4dd6-b5a5-d40689535353","Type":"ContainerStarted","Data":"9dd427f0808c1a814b1f3d52f135f5b67948c11a52287b9abffe6401170aa419"}
Apr 16 18:31:19.464713 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:19.464713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" event={"ID":"ec1cac8d-1583-4dd6-b5a5-d40689535353","Type":"ContainerStarted","Data":"8ce7164d5ebaac3cc30e43efe2797469da2e4e16fed144053d6af2a8de293c8e"}
Apr 16 18:31:19.483387 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:19.483337 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-k7whn" podStartSLOduration=46.105646746 podStartE2EDuration="48.483321803s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:16.149464402 +0000 UTC m=+50.616019266" lastFinishedPulling="2026-04-16 18:31:18.527139441 +0000 UTC m=+52.993694323" observedRunningTime="2026-04-16 18:31:19.482136704 +0000 UTC m=+53.948691601" watchObservedRunningTime="2026-04-16 18:31:19.483321803 +0000 UTC m=+53.949876686"
Apr 16 18:31:22.175266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:22.175238 2576 scope.go:117] "RemoveContainer" containerID="755c3fce3d17e1d4ef55a1e04783b3e4cfee7ce0e98c344c48dc84a441ab6ef8"
Apr 16 18:31:22.473866 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:22.473794 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:31:22.474002 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:22.473888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" event={"ID":"765cda1d-eaf6-43b6-a926-4ad4fe965542","Type":"ContainerStarted","Data":"5bf71fb300cb33ffa5bba57b5e7cdf4c973fdb8d7b0d01a1cc5a93ff81f54f51"}
Apr 16 18:31:22.474188 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:22.474169 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:22.494007 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:22.493967 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927" podStartSLOduration=46.127853456 podStartE2EDuration="51.493955142s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:01.268545044 +0000 UTC m=+35.735099909" lastFinishedPulling="2026-04-16 18:31:06.634646717 +0000 UTC m=+41.101201595" observedRunningTime="2026-04-16 18:31:22.492803787 +0000 UTC m=+56.959358672" watchObservedRunningTime="2026-04-16 18:31:22.493955142 +0000 UTC m=+56.960510028"
Apr 16 18:31:22.958477 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:22.958449 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-b7927"
Apr 16 18:31:25.345140 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:25.345112 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tchmw"
Apr 16 18:31:31.822531 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.822486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:31.822907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.822579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:31:31.825007 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.824983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"image-registry-55f7bf856d-b6qxj\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj"
Apr 16 18:31:31.825410 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.825376 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:31:31.835706 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.835683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8425304-94d1-408f-ac22-f5bb6adfce75-metrics-certs\") pod \"network-metrics-daemon-n66hf\" (UID: \"e8425304-94d1-408f-ac22-f5bb6adfce75\") " pod="openshift-multus/network-metrics-daemon-n66hf"
Apr 16 18:31:31.923940 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.923903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"
Apr 16 18:31:31.924114 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.923954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f"
Apr 16 18:31:31.924114 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.924001 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69"
Apr 16 18:31:31.924114 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.924023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:31.924114 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.924046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:31.924740 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.924707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-service-ca-bundle\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph"
Apr 16 18:31:31.926671 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.926640 2576
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3e312f71-4f6a-4206-99c4-62f2f2ab84ef-metrics-certs\") pod \"router-default-64bf8854b4-776ph\" (UID: \"3e312f71-4f6a-4206-99c4-62f2f2ab84ef\") " pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:31.926827 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.926805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pncs2\" (UniqueName: \"kubernetes.io/projected/8837a43b-32fb-45cb-9303-bc2b56966e5f-kube-api-access-pncs2\") pod \"network-check-target-qbq69\" (UID: \"8837a43b-32fb-45cb-9303-bc2b56966e5f\") " pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:31:31.926888 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.926874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cef0db6d-a3ae-4198-8447-b4ee557da9d1-metrics-tls\") pod \"dns-default-9ms5f\" (UID: \"cef0db6d-a3ae-4198-8447-b4ee557da9d1\") " pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:31.927068 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:31.927045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fcc6daec-498a-4d51-950c-80666fb565da-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-t9jsh\" (UID: \"fcc6daec-498a-4d51-950c-80666fb565da\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" Apr 16 18:31:32.024551 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.024513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: 
\"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:32.024711 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.024595 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:32.027179 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.027157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f-cert\") pod \"ingress-canary-7cbsl\" (UID: \"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f\") " pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:32.027620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.027602 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/592a7c8f-97a7-4307-9682-3926fa559c11-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jsghj\" (UID: \"592a7c8f-97a7-4307-9682-3926fa559c11\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:32.056042 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.056020 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-p5mxf\"" Apr 16 18:31:32.063241 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.063222 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:32.088626 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.088562 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-ds8t7\"" Apr 16 18:31:32.093683 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.093661 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2j6pd\"" Apr 16 18:31:32.095330 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.095316 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hnsm4\"" Apr 16 18:31:32.096825 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.096812 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:31:32.100841 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.100820 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mnpbz\"" Apr 16 18:31:32.101885 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.101862 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n66hf" Apr 16 18:31:32.104204 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.104176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:32.109311 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.108854 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:32.145702 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.145663 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-r4s88\"" Apr 16 18:31:32.155431 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.154908 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" Apr 16 18:31:32.198471 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.198204 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-44gxs\"" Apr 16 18:31:32.208100 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.205266 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7cbsl" Apr 16 18:31:32.237957 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.237903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55f7bf856d-b6qxj"] Apr 16 18:31:32.275755 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.265859 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4dtft\"" Apr 16 18:31:32.275755 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.271729 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" Apr 16 18:31:32.308282 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.304986 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qbq69"] Apr 16 18:31:32.411527 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.411505 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph"] Apr 16 18:31:32.419809 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.415201 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.419809 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.418647 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt"] Apr 16 18:31:32.426761 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.423523 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:31:32.426761 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.424338 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:31:32.426761 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.424989 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:31:32.432469 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.430680 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq"] Apr 16 18:31:32.432469 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.430755 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 18:31:32.432469 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.430897 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" Apr 16 18:31:32.440628 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.440026 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.440628 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.440327 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 18:31:32.441647 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.440841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph"] Apr 16 18:31:32.442155 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:32.442129 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcc6daec_498a_4d51_950c_80666fb565da.slice/crio-974fb6ca2a253d51a74126f0ec8dde71b91d28431c67d68abb88929a9af4fd7b WatchSource:0}: Error finding container 974fb6ca2a253d51a74126f0ec8dde71b91d28431c67d68abb88929a9af4fd7b: Status 404 returned error can't find the container with id 974fb6ca2a253d51a74126f0ec8dde71b91d28431c67d68abb88929a9af4fd7b Apr 16 18:31:32.442778 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.442382 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 18:31:32.444093 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.442497 2576 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-gjf2s\"" Apr 16 18:31:32.444308 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.442702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh"] Apr 16 18:31:32.444448 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.444432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt"] Apr 16 18:31:32.444562 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.443015 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 18:31:32.444736 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.442869 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 18:31:32.444878 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.443059 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 18:31:32.463927 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.463900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq"] Apr 16 18:31:32.469883 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.469857 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7cbsl"] Apr 16 18:31:32.470946 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:32.470920 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bdf3a87_71c1_4f98_8f1f_f3ebbbbf916f.slice/crio-ae919ef7b67d70dbfd1421f9f498923b2028502415033ae4896e081c83ab669a WatchSource:0}: Error finding container ae919ef7b67d70dbfd1421f9f498923b2028502415033ae4896e081c83ab669a: Status 404 returned error can't find the container with id ae919ef7b67d70dbfd1421f9f498923b2028502415033ae4896e081c83ab669a Apr 16 18:31:32.506973 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.506938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7cbsl" event={"ID":"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f","Type":"ContainerStarted","Data":"ae919ef7b67d70dbfd1421f9f498923b2028502415033ae4896e081c83ab669a"} Apr 16 18:31:32.508138 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.508112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" event={"ID":"fcc6daec-498a-4d51-950c-80666fb565da","Type":"ContainerStarted","Data":"974fb6ca2a253d51a74126f0ec8dde71b91d28431c67d68abb88929a9af4fd7b"} Apr 16 18:31:32.509449 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.509429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qbq69" event={"ID":"8837a43b-32fb-45cb-9303-bc2b56966e5f","Type":"ContainerStarted","Data":"e699b4ded0e5c3ca34ed719ce1a8e3ec60dbfb8b8081d2e090b12faefb7d3a4c"} Apr 16 18:31:32.509551 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.509460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qbq69" event={"ID":"8837a43b-32fb-45cb-9303-bc2b56966e5f","Type":"ContainerStarted","Data":"22d62f05919c3f3f4f23a9804e3de248980dd56e881295aaafcc62748aa0ede7"} Apr 16 18:31:32.509620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.509591 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:31:32.510975 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.510951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" event={"ID":"9eafbaff-2bb8-4c09-a410-a5e054fefae3","Type":"ContainerStarted","Data":"c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630"} Apr 16 18:31:32.511058 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.510983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" event={"ID":"9eafbaff-2bb8-4c09-a410-a5e054fefae3","Type":"ContainerStarted","Data":"22222db35e07ea8111178aeb10706bbe7b1c99ae13de662868ec37738ff9dcd2"} Apr 16 18:31:32.511181 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.511169 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:32.529005 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.528966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7ght\" (UniqueName: \"kubernetes.io/projected/8c623239-b167-4d2f-b1dd-dc719fb94abd-kube-api-access-l7ght\") pod \"managed-serviceaccount-addon-agent-687769b44f-9dlmt\" (UID: \"8c623239-b167-4d2f-b1dd-dc719fb94abd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" Apr 16 18:31:32.529005 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.528999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-ca\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.529193 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:31:32.529023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8c623239-b167-4d2f-b1dd-dc719fb94abd-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-687769b44f-9dlmt\" (UID: \"8c623239-b167-4d2f-b1dd-dc719fb94abd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" Apr 16 18:31:32.529193 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.529046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-hub\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.529193 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.529063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-tmp\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.529193 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.529100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4br\" (UniqueName: \"kubernetes.io/projected/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-kube-api-access-sb4br\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.529193 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.529121 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.529193 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.529141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.529469 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.529200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqcr9\" (UniqueName: \"kubernetes.io/projected/cd3125e1-4d29-47f7-8cce-daec4c138799-kube-api-access-gqcr9\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.529469 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.529251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cd3125e1-4d29-47f7-8cce-daec4c138799-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.529469 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.529284 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.543320 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.543291 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pjfc2"] Apr 16 18:31:32.547688 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.547666 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.559718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.559693 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj"] Apr 16 18:31:32.563567 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.563543 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:31:32.563680 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.563548 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:31:32.563817 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.563799 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sjnps\"" Apr 16 18:31:32.570029 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:32.570003 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod592a7c8f_97a7_4307_9682_3926fa559c11.slice/crio-eb59183b1547441a3ed8782135b9d4286766f7a6f523c28e60401f30a40ba6ab WatchSource:0}: Error finding 
container eb59183b1547441a3ed8782135b9d4286766f7a6f523c28e60401f30a40ba6ab: Status 404 returned error can't find the container with id eb59183b1547441a3ed8782135b9d4286766f7a6f523c28e60401f30a40ba6ab Apr 16 18:31:32.586756 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.586729 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pjfc2"] Apr 16 18:31:32.629805 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.629709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.629953 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.629835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7ght\" (UniqueName: \"kubernetes.io/projected/8c623239-b167-4d2f-b1dd-dc719fb94abd-kube-api-access-l7ght\") pod \"managed-serviceaccount-addon-agent-687769b44f-9dlmt\" (UID: \"8c623239-b167-4d2f-b1dd-dc719fb94abd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" Apr 16 18:31:32.629953 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.629874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/671bae0e-2470-403d-b4f3-7c607959438a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.629953 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.629905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-ca\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.629953 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.629931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8c623239-b167-4d2f-b1dd-dc719fb94abd-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-687769b44f-9dlmt\" (UID: \"8c623239-b167-4d2f-b1dd-dc719fb94abd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" Apr 16 18:31:32.630196 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.629999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-hub\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.630196 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-tmp\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.630196 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/671bae0e-2470-403d-b4f3-7c607959438a-crio-socket\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " 
pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.630196 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/671bae0e-2470-403d-b4f3-7c607959438a-data-volume\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.630196 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/671bae0e-2470-403d-b4f3-7c607959438a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.630196 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb4br\" (UniqueName: \"kubernetes.io/projected/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-kube-api-access-sb4br\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.630555 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.630555 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:31:32.630257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.630555 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqcr9\" (UniqueName: \"kubernetes.io/projected/cd3125e1-4d29-47f7-8cce-daec4c138799-kube-api-access-gqcr9\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.630555 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bftc\" (UniqueName: \"kubernetes.io/projected/671bae0e-2470-403d-b4f3-7c607959438a-kube-api-access-7bftc\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.630555 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.630348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/cd3125e1-4d29-47f7-8cce-daec4c138799-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.631173 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.631146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/cd3125e1-4d29-47f7-8cce-daec4c138799-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.632714 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.632666 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-tmp\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.634116 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.634089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-hub\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.634440 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.634370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-ca\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.635550 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.635526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.635704 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.635688 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.635769 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.635732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8c623239-b167-4d2f-b1dd-dc719fb94abd-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-687769b44f-9dlmt\" (UID: \"8c623239-b167-4d2f-b1dd-dc719fb94abd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" Apr 16 18:31:32.635868 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.635842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cd3125e1-4d29-47f7-8cce-daec4c138799-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.642302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.642262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n66hf"] Apr 16 18:31:32.645280 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:32.645259 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8425304_94d1_408f_ac22_f5bb6adfce75.slice/crio-f25e259dc6ffc9b1ee7f5eba120c48793c2b3508967ff164ed72487ac8369baa WatchSource:0}: Error 
finding container f25e259dc6ffc9b1ee7f5eba120c48793c2b3508967ff164ed72487ac8369baa: Status 404 returned error can't find the container with id f25e259dc6ffc9b1ee7f5eba120c48793c2b3508967ff164ed72487ac8369baa Apr 16 18:31:32.649432 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:32.649379 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcef0db6d_a3ae_4198_8447_b4ee557da9d1.slice/crio-78aad4302d0cdb4612a752800eb04a0f240ffad0274f1b3e79037e8ed008a5c7 WatchSource:0}: Error finding container 78aad4302d0cdb4612a752800eb04a0f240ffad0274f1b3e79037e8ed008a5c7: Status 404 returned error can't find the container with id 78aad4302d0cdb4612a752800eb04a0f240ffad0274f1b3e79037e8ed008a5c7 Apr 16 18:31:32.652220 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.652194 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-64bf8854b4-776ph"] Apr 16 18:31:32.653318 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.653299 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9ms5f"] Apr 16 18:31:32.664218 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.664191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7ght\" (UniqueName: \"kubernetes.io/projected/8c623239-b167-4d2f-b1dd-dc719fb94abd-kube-api-access-l7ght\") pod \"managed-serviceaccount-addon-agent-687769b44f-9dlmt\" (UID: \"8c623239-b167-4d2f-b1dd-dc719fb94abd\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" Apr 16 18:31:32.664335 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.664195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4br\" (UniqueName: \"kubernetes.io/projected/f3acfd61-794c-4fa1-b8bf-c6589e0c79eb-kube-api-access-sb4br\") pod \"klusterlet-addon-workmgr-6cb9f6d684-srnph\" (UID: \"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.664631 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.664615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqcr9\" (UniqueName: \"kubernetes.io/projected/cd3125e1-4d29-47f7-8cce-daec4c138799-kube-api-access-gqcr9\") pod \"cluster-proxy-proxy-agent-695f4fd687-4fzwq\" (UID: \"cd3125e1-4d29-47f7-8cce-daec4c138799\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.672640 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.672620 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-fsthh"] Apr 16 18:31:32.677043 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.677016 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-fsthh" Apr 16 18:31:32.680754 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.680732 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:31:32.680894 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.680870 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:31:32.681033 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.681013 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5b26l\"" Apr 16 18:31:32.691685 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.691663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-fsthh"] Apr 16 18:31:32.732255 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.731972 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/671bae0e-2470-403d-b4f3-7c607959438a-crio-socket\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.732255 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.732028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/671bae0e-2470-403d-b4f3-7c607959438a-data-volume\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.732255 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.732057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/671bae0e-2470-403d-b4f3-7c607959438a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.732255 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.732101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/671bae0e-2470-403d-b4f3-7c607959438a-crio-socket\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.732255 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.732111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bftc\" (UniqueName: \"kubernetes.io/projected/671bae0e-2470-403d-b4f3-7c607959438a-kube-api-access-7bftc\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.732255 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:31:32.732193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/671bae0e-2470-403d-b4f3-7c607959438a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.732766 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.732739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/671bae0e-2470-403d-b4f3-7c607959438a-data-volume\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.732911 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.732888 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/671bae0e-2470-403d-b4f3-7c607959438a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.735325 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.735301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/671bae0e-2470-403d-b4f3-7c607959438a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.744777 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.744755 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:32.751767 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.751717 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" podStartSLOduration=66.751698862 podStartE2EDuration="1m6.751698862s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:32.720428188 +0000 UTC m=+67.186983075" watchObservedRunningTime="2026-04-16 18:31:32.751698862 +0000 UTC m=+67.218253749" Apr 16 18:31:32.751967 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.751941 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qbq69" podStartSLOduration=66.751932351 podStartE2EDuration="1m6.751932351s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:32.750590478 +0000 UTC m=+67.217145364" watchObservedRunningTime="2026-04-16 18:31:32.751932351 +0000 UTC m=+67.218487239" Apr 16 18:31:32.753848 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.753822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bftc\" (UniqueName: \"kubernetes.io/projected/671bae0e-2470-403d-b4f3-7c607959438a-kube-api-access-7bftc\") pod \"insights-runtime-extractor-pjfc2\" (UID: \"671bae0e-2470-403d-b4f3-7c607959438a\") " pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.765121 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.765093 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" Apr 16 18:31:32.773586 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.772971 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" Apr 16 18:31:32.832844 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.832714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5dgc\" (UniqueName: \"kubernetes.io/projected/b4b5ffce-2f92-4b13-b96b-d7fa243d1a13-kube-api-access-s5dgc\") pod \"downloads-586b57c7b4-fsthh\" (UID: \"b4b5ffce-2f92-4b13-b96b-d7fa243d1a13\") " pod="openshift-console/downloads-586b57c7b4-fsthh" Apr 16 18:31:32.874158 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.873803 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pjfc2" Apr 16 18:31:32.937752 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.937712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5dgc\" (UniqueName: \"kubernetes.io/projected/b4b5ffce-2f92-4b13-b96b-d7fa243d1a13-kube-api-access-s5dgc\") pod \"downloads-586b57c7b4-fsthh\" (UID: \"b4b5ffce-2f92-4b13-b96b-d7fa243d1a13\") " pod="openshift-console/downloads-586b57c7b4-fsthh" Apr 16 18:31:32.944369 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.943458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph"] Apr 16 18:31:32.947522 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:32.947438 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3acfd61_794c_4fa1_b8bf_c6589e0c79eb.slice/crio-e8b8ebafd251dd3e306ac18558924cc7ce2efd1a31d558594418d3e4b8e124ca 
WatchSource:0}: Error finding container e8b8ebafd251dd3e306ac18558924cc7ce2efd1a31d558594418d3e4b8e124ca: Status 404 returned error can't find the container with id e8b8ebafd251dd3e306ac18558924cc7ce2efd1a31d558594418d3e4b8e124ca Apr 16 18:31:32.950587 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.950563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5dgc\" (UniqueName: \"kubernetes.io/projected/b4b5ffce-2f92-4b13-b96b-d7fa243d1a13-kube-api-access-s5dgc\") pod \"downloads-586b57c7b4-fsthh\" (UID: \"b4b5ffce-2f92-4b13-b96b-d7fa243d1a13\") " pod="openshift-console/downloads-586b57c7b4-fsthh" Apr 16 18:31:32.963615 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.963576 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt"] Apr 16 18:31:32.968799 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:32.968755 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c623239_b167_4d2f_b1dd_dc719fb94abd.slice/crio-2b886bedb462f30d024607809c3316a660f9d76f9d3ed9bf78d5772ef2044969 WatchSource:0}: Error finding container 2b886bedb462f30d024607809c3316a660f9d76f9d3ed9bf78d5772ef2044969: Status 404 returned error can't find the container with id 2b886bedb462f30d024607809c3316a660f9d76f9d3ed9bf78d5772ef2044969 Apr 16 18:31:32.983729 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.983702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq"] Apr 16 18:31:32.988552 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:32.988514 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd3125e1_4d29_47f7_8cce_daec4c138799.slice/crio-0aebeb15760ef264d922408d7307fcc4fcb7e6346d511a7f6103e168a4fb9256 
WatchSource:0}: Error finding container 0aebeb15760ef264d922408d7307fcc4fcb7e6346d511a7f6103e168a4fb9256: Status 404 returned error can't find the container with id 0aebeb15760ef264d922408d7307fcc4fcb7e6346d511a7f6103e168a4fb9256 Apr 16 18:31:32.999563 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:32.999507 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-fsthh" Apr 16 18:31:33.060098 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.060040 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pjfc2"] Apr 16 18:31:33.067461 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:33.067262 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671bae0e_2470_403d_b4f3_7c607959438a.slice/crio-1036a2362663fe5a594624683eb3260fb15fb8567300e3291645c9c4cffe0d04 WatchSource:0}: Error finding container 1036a2362663fe5a594624683eb3260fb15fb8567300e3291645c9c4cffe0d04: Status 404 returned error can't find the container with id 1036a2362663fe5a594624683eb3260fb15fb8567300e3291645c9c4cffe0d04 Apr 16 18:31:33.175156 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.175119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-fsthh"] Apr 16 18:31:33.178649 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:33.178616 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b5ffce_2f92_4b13_b96b_d7fa243d1a13.slice/crio-d82c589d693cbc1aed1f81ce07f876cfd19a540bf52d38ce7dec26a426f67a15 WatchSource:0}: Error finding container d82c589d693cbc1aed1f81ce07f876cfd19a540bf52d38ce7dec26a426f67a15: Status 404 returned error can't find the container with id d82c589d693cbc1aed1f81ce07f876cfd19a540bf52d38ce7dec26a426f67a15 Apr 16 18:31:33.523939 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:31:33.523789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-fsthh" event={"ID":"b4b5ffce-2f92-4b13-b96b-d7fa243d1a13","Type":"ContainerStarted","Data":"d82c589d693cbc1aed1f81ce07f876cfd19a540bf52d38ce7dec26a426f67a15"} Apr 16 18:31:33.525957 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.525907 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" event={"ID":"8c623239-b167-4d2f-b1dd-dc719fb94abd","Type":"ContainerStarted","Data":"2b886bedb462f30d024607809c3316a660f9d76f9d3ed9bf78d5772ef2044969"} Apr 16 18:31:33.527938 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.527910 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-64bf8854b4-776ph" event={"ID":"3e312f71-4f6a-4206-99c4-62f2f2ab84ef","Type":"ContainerStarted","Data":"feceae508285d2c94cf2156e7a178ff0477656b655620e5cb05bfdff6653bc4e"} Apr 16 18:31:33.528047 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.527941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-64bf8854b4-776ph" event={"ID":"3e312f71-4f6a-4206-99c4-62f2f2ab84ef","Type":"ContainerStarted","Data":"1851bfc506dd87eea5f0c0adf05de301c1d496dad2d50d33cf2e99750ea52b7e"} Apr 16 18:31:33.531433 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.531367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n66hf" event={"ID":"e8425304-94d1-408f-ac22-f5bb6adfce75","Type":"ContainerStarted","Data":"f25e259dc6ffc9b1ee7f5eba120c48793c2b3508967ff164ed72487ac8369baa"} Apr 16 18:31:33.533997 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.533974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pjfc2" 
event={"ID":"671bae0e-2470-403d-b4f3-7c607959438a","Type":"ContainerStarted","Data":"ec0783ef383d8951d7a2f974a8fa62dde8803325471bd88c7da943bea33893c9"} Apr 16 18:31:33.534106 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.534006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pjfc2" event={"ID":"671bae0e-2470-403d-b4f3-7c607959438a","Type":"ContainerStarted","Data":"1036a2362663fe5a594624683eb3260fb15fb8567300e3291645c9c4cffe0d04"} Apr 16 18:31:33.535945 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.535914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" event={"ID":"cd3125e1-4d29-47f7-8cce-daec4c138799","Type":"ContainerStarted","Data":"0aebeb15760ef264d922408d7307fcc4fcb7e6346d511a7f6103e168a4fb9256"} Apr 16 18:31:33.538090 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.538025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" event={"ID":"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb","Type":"ContainerStarted","Data":"e8b8ebafd251dd3e306ac18558924cc7ce2efd1a31d558594418d3e4b8e124ca"} Apr 16 18:31:33.540446 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.540414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" event={"ID":"592a7c8f-97a7-4307-9682-3926fa559c11","Type":"ContainerStarted","Data":"eb59183b1547441a3ed8782135b9d4286766f7a6f523c28e60401f30a40ba6ab"} Apr 16 18:31:33.542828 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:33.542799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9ms5f" event={"ID":"cef0db6d-a3ae-4198-8447-b4ee557da9d1","Type":"ContainerStarted","Data":"78aad4302d0cdb4612a752800eb04a0f240ffad0274f1b3e79037e8ed008a5c7"} Apr 16 18:31:33.556138 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:31:33.555042 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-64bf8854b4-776ph" podStartSLOduration=62.555026764 podStartE2EDuration="1m2.555026764s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:31:33.553555445 +0000 UTC m=+68.020110327" watchObservedRunningTime="2026-04-16 18:31:33.555026764 +0000 UTC m=+68.021581652" Apr 16 18:31:34.105585 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:34.105522 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:34.108704 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:34.108490 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:34.550869 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:34.550783 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:34.553307 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:34.553097 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-64bf8854b4-776ph" Apr 16 18:31:41.579780 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.579740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pjfc2" event={"ID":"671bae0e-2470-403d-b4f3-7c607959438a","Type":"ContainerStarted","Data":"3b216f039a2f003de8a72f7572964877443a224d585d0a8ed5fcdb86664c17ce"} Apr 16 18:31:41.581692 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.581657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" 
event={"ID":"cd3125e1-4d29-47f7-8cce-daec4c138799","Type":"ContainerStarted","Data":"c547610478139426cbf8e3a050bf340bd79accf985f59a8c12e6c6e0fe06a6ec"} Apr 16 18:31:41.583133 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.583107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" event={"ID":"f3acfd61-794c-4fa1-b8bf-c6589e0c79eb","Type":"ContainerStarted","Data":"c5e1df82452cdc7b518c66b09580e1daf4f62e0e6766d9f48224bd1b802ff203"} Apr 16 18:31:41.583507 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.583483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:41.585086 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.584950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" event={"ID":"592a7c8f-97a7-4307-9682-3926fa559c11","Type":"ContainerStarted","Data":"18da7036c30b45c39fa72a148294e36d3c5567de8b959435711fe41a032aa6bb"} Apr 16 18:31:41.585667 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.585646 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" Apr 16 18:31:41.587292 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.587271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9ms5f" event={"ID":"cef0db6d-a3ae-4198-8447-b4ee557da9d1","Type":"ContainerStarted","Data":"4a52dc211e67f6f8fa082df61c4f82eb0de4ee289689b339765aba653b73057d"} Apr 16 18:31:41.587432 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.587299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9ms5f" 
event={"ID":"cef0db6d-a3ae-4198-8447-b4ee557da9d1","Type":"ContainerStarted","Data":"4711c464ad79cc9061d68f4b5958d2aaa466816f3799e20930ce6a6b445d078e"} Apr 16 18:31:41.587432 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.587422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:41.589005 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.588985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7cbsl" event={"ID":"2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f","Type":"ContainerStarted","Data":"3ffa33e4ada77a5957da991c2c9bc3df178298e550436130a1a766ad0bea978c"} Apr 16 18:31:41.590625 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.590589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" event={"ID":"8c623239-b167-4d2f-b1dd-dc719fb94abd","Type":"ContainerStarted","Data":"2335a39b8a698bbfb960b3968e39caf9295cbb0562ff3b971635eff8977034df"} Apr 16 18:31:41.592425 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.592371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n66hf" event={"ID":"e8425304-94d1-408f-ac22-f5bb6adfce75","Type":"ContainerStarted","Data":"8e48aa8d997e413b21ebab196b8a0ca08e960d845ca1c42c78287428468ab1b0"} Apr 16 18:31:41.592512 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.592440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n66hf" event={"ID":"e8425304-94d1-408f-ac22-f5bb6adfce75","Type":"ContainerStarted","Data":"cbdfa85fb76afb3d0a183a1b347646d8b977d4bb7595dc3c9c07e7a8ec5959c7"} Apr 16 18:31:41.593889 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.593869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" 
event={"ID":"fcc6daec-498a-4d51-950c-80666fb565da","Type":"ContainerStarted","Data":"cd4c16bf391b37b9342f27b3839297de46cd2d325805a58443bb0f420ff425af"} Apr 16 18:31:41.608279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.608224 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cb9f6d684-srnph" podStartSLOduration=1.5885600069999999 podStartE2EDuration="9.608211427s" podCreationTimestamp="2026-04-16 18:31:32 +0000 UTC" firstStartedPulling="2026-04-16 18:31:32.950233076 +0000 UTC m=+67.416787944" lastFinishedPulling="2026-04-16 18:31:40.969884496 +0000 UTC m=+75.436439364" observedRunningTime="2026-04-16 18:31:41.606759643 +0000 UTC m=+76.073314528" watchObservedRunningTime="2026-04-16 18:31:41.608211427 +0000 UTC m=+76.074766318" Apr 16 18:31:41.680491 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.680430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jsghj" podStartSLOduration=62.296744109 podStartE2EDuration="1m10.680410368s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:32.571969651 +0000 UTC m=+67.038524514" lastFinishedPulling="2026-04-16 18:31:40.955635896 +0000 UTC m=+75.422190773" observedRunningTime="2026-04-16 18:31:41.641589381 +0000 UTC m=+76.108144280" watchObservedRunningTime="2026-04-16 18:31:41.680410368 +0000 UTC m=+76.146965248" Apr 16 18:31:41.681201 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.681123 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n66hf" podStartSLOduration=67.384732531 podStartE2EDuration="1m15.681114724s" podCreationTimestamp="2026-04-16 18:30:26 +0000 UTC" firstStartedPulling="2026-04-16 18:31:32.647590528 +0000 UTC m=+67.114145399" lastFinishedPulling="2026-04-16 18:31:40.943972711 +0000 UTC m=+75.410527592" 
observedRunningTime="2026-04-16 18:31:41.678778985 +0000 UTC m=+76.145333872" watchObservedRunningTime="2026-04-16 18:31:41.681114724 +0000 UTC m=+76.147669613" Apr 16 18:31:41.700591 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.700523 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9ms5f" podStartSLOduration=34.407494235 podStartE2EDuration="42.700508588s" podCreationTimestamp="2026-04-16 18:30:59 +0000 UTC" firstStartedPulling="2026-04-16 18:31:32.651321401 +0000 UTC m=+67.117876265" lastFinishedPulling="2026-04-16 18:31:40.944335754 +0000 UTC m=+75.410890618" observedRunningTime="2026-04-16 18:31:41.698797527 +0000 UTC m=+76.165352452" watchObservedRunningTime="2026-04-16 18:31:41.700508588 +0000 UTC m=+76.167063474" Apr 16 18:31:41.721159 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.721112 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7cbsl" podStartSLOduration=34.250257093 podStartE2EDuration="42.72109794s" podCreationTimestamp="2026-04-16 18:30:59 +0000 UTC" firstStartedPulling="2026-04-16 18:31:32.473114227 +0000 UTC m=+66.939669104" lastFinishedPulling="2026-04-16 18:31:40.943955073 +0000 UTC m=+75.410509951" observedRunningTime="2026-04-16 18:31:41.719619966 +0000 UTC m=+76.186174843" watchObservedRunningTime="2026-04-16 18:31:41.72109794 +0000 UTC m=+76.187652826" Apr 16 18:31:41.740950 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.740217 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-t9jsh" podStartSLOduration=62.243927956 podStartE2EDuration="1m10.740200012s" podCreationTimestamp="2026-04-16 18:30:31 +0000 UTC" firstStartedPulling="2026-04-16 18:31:32.444357089 +0000 UTC m=+66.910911968" lastFinishedPulling="2026-04-16 18:31:40.940629152 +0000 UTC m=+75.407184024" observedRunningTime="2026-04-16 18:31:41.737180218 
+0000 UTC m=+76.203735105" watchObservedRunningTime="2026-04-16 18:31:41.740200012 +0000 UTC m=+76.206754899" Apr 16 18:31:41.812301 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:41.812241 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-687769b44f-9dlmt" podStartSLOduration=1.821500943 podStartE2EDuration="9.812219812s" podCreationTimestamp="2026-04-16 18:31:32 +0000 UTC" firstStartedPulling="2026-04-16 18:31:32.972695332 +0000 UTC m=+67.439250201" lastFinishedPulling="2026-04-16 18:31:40.963414191 +0000 UTC m=+75.429969070" observedRunningTime="2026-04-16 18:31:41.807267586 +0000 UTC m=+76.273822498" watchObservedRunningTime="2026-04-16 18:31:41.812219812 +0000 UTC m=+76.278774698" Apr 16 18:31:44.607709 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:44.607654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pjfc2" event={"ID":"671bae0e-2470-403d-b4f3-7c607959438a","Type":"ContainerStarted","Data":"8c95192da994df8f40a6e6842fbe4a910c614139eb2ec437b49bfdddd8235cd4"} Apr 16 18:31:44.609850 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:44.609818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" event={"ID":"cd3125e1-4d29-47f7-8cce-daec4c138799","Type":"ContainerStarted","Data":"511d84d8299586cc6506d41b6acf905d1b130bc23f04bc8208abfa6514b71697"} Apr 16 18:31:44.609990 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:44.609855 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" event={"ID":"cd3125e1-4d29-47f7-8cce-daec4c138799","Type":"ContainerStarted","Data":"3fd534d8c4e58c612f70d5fbb276600b3fa8187e495adf3708f7c84c7fefd3e8"} Apr 16 18:31:44.626862 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:44.626812 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pjfc2" podStartSLOduration=1.9145914880000001 podStartE2EDuration="12.626795785s" podCreationTimestamp="2026-04-16 18:31:32 +0000 UTC" firstStartedPulling="2026-04-16 18:31:33.175751704 +0000 UTC m=+67.642306571" lastFinishedPulling="2026-04-16 18:31:43.887955999 +0000 UTC m=+78.354510868" observedRunningTime="2026-04-16 18:31:44.625290643 +0000 UTC m=+79.091845526" watchObservedRunningTime="2026-04-16 18:31:44.626795785 +0000 UTC m=+79.093350674" Apr 16 18:31:44.645837 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:44.645600 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-695f4fd687-4fzwq" podStartSLOduration=1.744147242 podStartE2EDuration="12.645583943s" podCreationTimestamp="2026-04-16 18:31:32 +0000 UTC" firstStartedPulling="2026-04-16 18:31:32.991019114 +0000 UTC m=+67.457573978" lastFinishedPulling="2026-04-16 18:31:43.892455808 +0000 UTC m=+78.359010679" observedRunningTime="2026-04-16 18:31:44.644050686 +0000 UTC m=+79.110605575" watchObservedRunningTime="2026-04-16 18:31:44.645583943 +0000 UTC m=+79.112138835" Apr 16 18:31:51.601969 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:51.601922 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9ms5f" Apr 16 18:31:52.068195 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.068160 2576 patch_prober.go:28] interesting pod/image-registry-55f7bf856d-b6qxj container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:31:52.068423 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.068219 2576 prober.go:120] "Probe failed" 
probeType="Liveness" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" podUID="9eafbaff-2bb8-4c09-a410-a5e054fefae3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:52.243671 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.243628 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tp9dv"] Apr 16 18:31:52.329465 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.329105 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.333146 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.333116 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:31:52.333309 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.333290 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:31:52.334056 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.333562 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:31:52.334056 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.333583 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fwjvq\"" Apr 16 18:31:52.334056 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.333894 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:31:52.415927 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.415893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-sys\") pod 
\"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.416082 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.415936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.416082 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.415974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b49530-da68-40c1-86b7-5787b7b11a79-metrics-client-ca\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.416082 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.416027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-tls\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.416082 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.416054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-accelerators-collector-config\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.416352 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.416108 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-textfile\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.416352 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.416134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xfv\" (UniqueName: \"kubernetes.io/projected/c4b49530-da68-40c1-86b7-5787b7b11a79-kube-api-access-w5xfv\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.416352 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.416164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-root\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.416352 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.416227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-wtmp\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.516907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.516875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-wtmp\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " 
pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.516914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-sys\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.516941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.516967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b49530-da68-40c1-86b7-5787b7b11a79-metrics-client-ca\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-tls\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-sys\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " 
pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-accelerators-collector-config\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-wtmp\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517478 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-textfile\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517478 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xfv\" (UniqueName: \"kubernetes.io/projected/c4b49530-da68-40c1-86b7-5787b7b11a79-kube-api-access-w5xfv\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517478 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-root\") pod 
\"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517478 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c4b49530-da68-40c1-86b7-5787b7b11a79-root\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517478 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-textfile\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517919 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-accelerators-collector-config\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.517919 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.517716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b49530-da68-40c1-86b7-5787b7b11a79-metrics-client-ca\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.519569 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.519550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.519735 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.519716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c4b49530-da68-40c1-86b7-5787b7b11a79-node-exporter-tls\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.533781 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.533757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xfv\" (UniqueName: \"kubernetes.io/projected/c4b49530-da68-40c1-86b7-5787b7b11a79-kube-api-access-w5xfv\") pod \"node-exporter-tp9dv\" (UID: \"c4b49530-da68-40c1-86b7-5787b7b11a79\") " pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.645880 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:52.645808 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tp9dv" Apr 16 18:31:52.801794 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:31:52.801757 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4b49530_da68_40c1_86b7_5787b7b11a79.slice/crio-422feb202798ff4eb1322091e2c9bdf1523bed717de6e507c55c55dd7bd7a613 WatchSource:0}: Error finding container 422feb202798ff4eb1322091e2c9bdf1523bed717de6e507c55c55dd7bd7a613: Status 404 returned error can't find the container with id 422feb202798ff4eb1322091e2c9bdf1523bed717de6e507c55c55dd7bd7a613 Apr 16 18:31:53.549073 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:53.548346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:31:53.641974 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:53.641888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-fsthh" event={"ID":"b4b5ffce-2f92-4b13-b96b-d7fa243d1a13","Type":"ContainerStarted","Data":"e855c566556498c8e88e84b8247c6c54613e735da67c3944cef6e813c894e04f"} Apr 16 18:31:53.642266 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:53.642167 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-fsthh" Apr 16 18:31:53.643309 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:53.643281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9dv" event={"ID":"c4b49530-da68-40c1-86b7-5787b7b11a79","Type":"ContainerStarted","Data":"422feb202798ff4eb1322091e2c9bdf1523bed717de6e507c55c55dd7bd7a613"} Apr 16 18:31:53.652825 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:53.652802 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-fsthh" Apr 16 18:31:53.661943 ip-10-0-137-47 kubenswrapper[2576]: 
I0416 18:31:53.661884 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-fsthh" podStartSLOduration=1.9984455589999999 podStartE2EDuration="21.661868526s" podCreationTimestamp="2026-04-16 18:31:32 +0000 UTC" firstStartedPulling="2026-04-16 18:31:33.182030359 +0000 UTC m=+67.648585235" lastFinishedPulling="2026-04-16 18:31:52.845453334 +0000 UTC m=+87.312008202" observedRunningTime="2026-04-16 18:31:53.660504138 +0000 UTC m=+88.127059025" watchObservedRunningTime="2026-04-16 18:31:53.661868526 +0000 UTC m=+88.128423413" Apr 16 18:31:54.649169 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:54.649124 2576 generic.go:358] "Generic (PLEG): container finished" podID="c4b49530-da68-40c1-86b7-5787b7b11a79" containerID="f25888b348b3e01b7cba22e094f988d0fb31eb505961a73a64af4c6869a67b20" exitCode=0 Apr 16 18:31:54.649343 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:54.649215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9dv" event={"ID":"c4b49530-da68-40c1-86b7-5787b7b11a79","Type":"ContainerDied","Data":"f25888b348b3e01b7cba22e094f988d0fb31eb505961a73a64af4c6869a67b20"} Apr 16 18:31:55.655162 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:55.655115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9dv" event={"ID":"c4b49530-da68-40c1-86b7-5787b7b11a79","Type":"ContainerStarted","Data":"7351ed9b3390a161192bf1d773a3615cc16c9bd2d885947491f849080d9d2188"} Apr 16 18:31:55.655162 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:55.655165 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9dv" event={"ID":"c4b49530-da68-40c1-86b7-5787b7b11a79","Type":"ContainerStarted","Data":"b1c3aae69f9942a3409e2ca72c9f124c2d12aa23b93960c6f7a5340376d98501"} Apr 16 18:31:55.677848 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:55.677768 2576 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-monitoring/node-exporter-tp9dv" podStartSLOduration=2.796736033 podStartE2EDuration="3.677749317s" podCreationTimestamp="2026-04-16 18:31:52 +0000 UTC" firstStartedPulling="2026-04-16 18:31:52.803865403 +0000 UTC m=+87.270420273" lastFinishedPulling="2026-04-16 18:31:53.684878682 +0000 UTC m=+88.151433557" observedRunningTime="2026-04-16 18:31:55.67562588 +0000 UTC m=+90.142180768" watchObservedRunningTime="2026-04-16 18:31:55.677749317 +0000 UTC m=+90.144304204" Apr 16 18:31:59.783961 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:31:59.783931 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55f7bf856d-b6qxj"] Apr 16 18:32:03.545575 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:03.545542 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qbq69" Apr 16 18:32:22.739221 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:22.739187 2576 generic.go:358] "Generic (PLEG): container finished" podID="3fe5dd28-9069-4e1e-9331-ddd24da0b5f2" containerID="695440e7264caad758cd23182306bd18762f6bb4131b18f819913594d9ca3831" exitCode=0 Apr 16 18:32:22.739643 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:22.739262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" event={"ID":"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2","Type":"ContainerDied","Data":"695440e7264caad758cd23182306bd18762f6bb4131b18f819913594d9ca3831"} Apr 16 18:32:22.739643 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:22.739619 2576 scope.go:117] "RemoveContainer" containerID="695440e7264caad758cd23182306bd18762f6bb4131b18f819913594d9ca3831" Apr 16 18:32:23.743880 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:23.743846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-g7zhh" 
event={"ID":"3fe5dd28-9069-4e1e-9331-ddd24da0b5f2","Type":"ContainerStarted","Data":"ac36b9215c1e1422b66cafd69d6cce41c56895a8f4daec6dde8b29d323a80db8"} Apr 16 18:32:24.811473 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:24.811389 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" podUID="9eafbaff-2bb8-4c09-a410-a5e054fefae3" containerName="registry" containerID="cri-o://c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630" gracePeriod=30 Apr 16 18:32:25.077701 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.077676 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:32:25.202484 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.202448 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-installation-pull-secrets\") pod \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " Apr 16 18:32:25.202484 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.202489 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9eafbaff-2bb8-4c09-a410-a5e054fefae3-ca-trust-extracted\") pod \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " Apr 16 18:32:25.202727 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.202548 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-image-registry-private-configuration\") pod \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " Apr 16 18:32:25.202727 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:32:25.202589 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-certificates\") pod \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " Apr 16 18:32:25.202727 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.202619 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-bound-sa-token\") pod \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " Apr 16 18:32:25.202727 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.202689 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-trusted-ca\") pod \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " Apr 16 18:32:25.202919 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.202730 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdgtn\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-kube-api-access-cdgtn\") pod \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " Apr 16 18:32:25.202919 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.202765 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") pod \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\" (UID: \"9eafbaff-2bb8-4c09-a410-a5e054fefae3\") " Apr 16 18:32:25.203280 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.203216 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9eafbaff-2bb8-4c09-a410-a5e054fefae3" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:25.203491 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.203462 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9eafbaff-2bb8-4c09-a410-a5e054fefae3" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:32:25.205898 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.205861 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9eafbaff-2bb8-4c09-a410-a5e054fefae3" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:32:25.205898 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.205856 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9eafbaff-2bb8-4c09-a410-a5e054fefae3" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:32:25.206049 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.205886 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9eafbaff-2bb8-4c09-a410-a5e054fefae3" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:25.206049 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.205979 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-kube-api-access-cdgtn" (OuterVolumeSpecName: "kube-api-access-cdgtn") pod "9eafbaff-2bb8-4c09-a410-a5e054fefae3" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3"). InnerVolumeSpecName "kube-api-access-cdgtn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:32:25.206049 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.205997 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "9eafbaff-2bb8-4c09-a410-a5e054fefae3" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:32:25.211636 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.211608 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9eafbaff-2bb8-4c09-a410-a5e054fefae3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9eafbaff-2bb8-4c09-a410-a5e054fefae3" (UID: "9eafbaff-2bb8-4c09-a410-a5e054fefae3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:32:25.304277 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.304242 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cdgtn\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-kube-api-access-cdgtn\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:32:25.304277 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.304270 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-tls\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:32:25.304277 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.304279 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-installation-pull-secrets\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:32:25.304524 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.304288 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9eafbaff-2bb8-4c09-a410-a5e054fefae3-ca-trust-extracted\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:32:25.304524 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.304298 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9eafbaff-2bb8-4c09-a410-a5e054fefae3-image-registry-private-configuration\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:32:25.304524 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.304308 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-registry-certificates\") on node \"ip-10-0-137-47.ec2.internal\" 
DevicePath \"\"" Apr 16 18:32:25.304524 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.304316 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9eafbaff-2bb8-4c09-a410-a5e054fefae3-bound-sa-token\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:32:25.304524 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.304324 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9eafbaff-2bb8-4c09-a410-a5e054fefae3-trusted-ca\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:32:25.750736 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.750700 2576 generic.go:358] "Generic (PLEG): container finished" podID="9eafbaff-2bb8-4c09-a410-a5e054fefae3" containerID="c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630" exitCode=0 Apr 16 18:32:25.750903 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.750761 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" Apr 16 18:32:25.750903 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.750773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" event={"ID":"9eafbaff-2bb8-4c09-a410-a5e054fefae3","Type":"ContainerDied","Data":"c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630"} Apr 16 18:32:25.750903 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.750808 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55f7bf856d-b6qxj" event={"ID":"9eafbaff-2bb8-4c09-a410-a5e054fefae3","Type":"ContainerDied","Data":"22222db35e07ea8111178aeb10706bbe7b1c99ae13de662868ec37738ff9dcd2"} Apr 16 18:32:25.750903 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.750824 2576 scope.go:117] "RemoveContainer" containerID="c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630" Apr 16 18:32:25.759630 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.759612 2576 scope.go:117] "RemoveContainer" containerID="c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630" Apr 16 18:32:25.759900 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:32:25.759878 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630\": container with ID starting with c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630 not found: ID does not exist" containerID="c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630" Apr 16 18:32:25.759952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.759910 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630"} err="failed to get container status 
\"c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630\": rpc error: code = NotFound desc = could not find container \"c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630\": container with ID starting with c94a7e0353219310f48e9abfa5938269335aa69cac64668adea418d8cb38f630 not found: ID does not exist" Apr 16 18:32:25.775164 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.775139 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-55f7bf856d-b6qxj"] Apr 16 18:32:25.780647 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:25.780627 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-55f7bf856d-b6qxj"] Apr 16 18:32:26.179857 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:26.179824 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eafbaff-2bb8-4c09-a410-a5e054fefae3" path="/var/lib/kubelet/pods/9eafbaff-2bb8-4c09-a410-a5e054fefae3/volumes" Apr 16 18:32:27.760843 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:27.760808 2576 generic.go:358] "Generic (PLEG): container finished" podID="3e48aa88-413f-40b4-bf6a-2dc0acc72e3a" containerID="a30b2bfc4ef91bfcd3de9ca09020f17589d6992aa34add619d6885fd81e47657" exitCode=0 Apr 16 18:32:27.761168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:27.760877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" event={"ID":"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a","Type":"ContainerDied","Data":"a30b2bfc4ef91bfcd3de9ca09020f17589d6992aa34add619d6885fd81e47657"} Apr 16 18:32:27.761213 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:27.761200 2576 scope.go:117] "RemoveContainer" containerID="a30b2bfc4ef91bfcd3de9ca09020f17589d6992aa34add619d6885fd81e47657" Apr 16 18:32:28.765958 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:28.765922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vw2xc" event={"ID":"3e48aa88-413f-40b4-bf6a-2dc0acc72e3a","Type":"ContainerStarted","Data":"e942bb21341d0f597104536941a4aa3cd07381037c2c382ec3af9902c0cf2e81"} Apr 16 18:32:32.777948 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:32.777910 2576 generic.go:358] "Generic (PLEG): container finished" podID="ace4cbeb-ecb8-4ffc-b087-db80889cc00f" containerID="0232b20447cc55bc4f3b2c5d25ec7a6aeb83e1174deb5991b254a94f5e73422e" exitCode=0 Apr 16 18:32:32.778310 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:32.777965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" event={"ID":"ace4cbeb-ecb8-4ffc-b087-db80889cc00f","Type":"ContainerDied","Data":"0232b20447cc55bc4f3b2c5d25ec7a6aeb83e1174deb5991b254a94f5e73422e"} Apr 16 18:32:32.778355 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:32.778337 2576 scope.go:117] "RemoveContainer" containerID="0232b20447cc55bc4f3b2c5d25ec7a6aeb83e1174deb5991b254a94f5e73422e" Apr 16 18:32:33.781909 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:32:33.781874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-5qhhj" event={"ID":"ace4cbeb-ecb8-4ffc-b087-db80889cc00f","Type":"ContainerStarted","Data":"2df03d7396162456b285935fa5b51d2ea9093f008d7e4b297df21c6ce2be49ba"} Apr 16 18:35:26.051893 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:35:26.051866 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log" Apr 16 18:35:26.053867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:35:26.053846 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log" Apr 16 
18:35:26.066269 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:35:26.066247 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:36:53.301267 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.301233 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4"] Apr 16 18:36:53.301743 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.301575 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9eafbaff-2bb8-4c09-a410-a5e054fefae3" containerName="registry" Apr 16 18:36:53.301743 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.301588 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eafbaff-2bb8-4c09-a410-a5e054fefae3" containerName="registry" Apr 16 18:36:53.301743 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.301644 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9eafbaff-2bb8-4c09-a410-a5e054fefae3" containerName="registry" Apr 16 18:36:53.304434 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.304418 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.307596 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.307575 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 18:36:53.308272 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.308254 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 18:36:53.308497 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.308478 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 18:36:53.308592 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.308501 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 18:36:53.308592 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.308525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-277fx\"" Apr 16 18:36:53.308592 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.308583 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 18:36:53.323271 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.323251 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4"] Apr 16 18:36:53.345699 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.345668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6qpq\" (UniqueName: \"kubernetes.io/projected/331f686d-81a7-475d-8b25-fa2ec126dc59-kube-api-access-p6qpq\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") 
" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.345842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.345727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/331f686d-81a7-475d-8b25-fa2ec126dc59-metrics-cert\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.345842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.345787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/331f686d-81a7-475d-8b25-fa2ec126dc59-manager-config\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.345842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.345817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/331f686d-81a7-475d-8b25-fa2ec126dc59-cert\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.446374 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.446343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/331f686d-81a7-475d-8b25-fa2ec126dc59-metrics-cert\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.446546 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.446419 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/331f686d-81a7-475d-8b25-fa2ec126dc59-manager-config\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.446546 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.446445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/331f686d-81a7-475d-8b25-fa2ec126dc59-cert\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.446546 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.446477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6qpq\" (UniqueName: \"kubernetes.io/projected/331f686d-81a7-475d-8b25-fa2ec126dc59-kube-api-access-p6qpq\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.447113 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.447091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/331f686d-81a7-475d-8b25-fa2ec126dc59-manager-config\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.449279 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.449254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/331f686d-81a7-475d-8b25-fa2ec126dc59-cert\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " 
pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.449384 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.449254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/331f686d-81a7-475d-8b25-fa2ec126dc59-metrics-cert\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.455797 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.455777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6qpq\" (UniqueName: \"kubernetes.io/projected/331f686d-81a7-475d-8b25-fa2ec126dc59-kube-api-access-p6qpq\") pod \"lws-controller-manager-7fd84c546d-lxlf4\" (UID: \"331f686d-81a7-475d-8b25-fa2ec126dc59\") " pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.613721 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.613691 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:36:53.738316 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.738286 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4"] Apr 16 18:36:53.740738 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:36:53.740710 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331f686d_81a7_475d_8b25_fa2ec126dc59.slice/crio-5e6e825c0a784fec8688403bec4268b3155675977c5d34ef748c0e28cbb81a36 WatchSource:0}: Error finding container 5e6e825c0a784fec8688403bec4268b3155675977c5d34ef748c0e28cbb81a36: Status 404 returned error can't find the container with id 5e6e825c0a784fec8688403bec4268b3155675977c5d34ef748c0e28cbb81a36 Apr 16 18:36:53.742566 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:53.742545 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:36:54.530557 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:54.530502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" event={"ID":"331f686d-81a7-475d-8b25-fa2ec126dc59","Type":"ContainerStarted","Data":"5e6e825c0a784fec8688403bec4268b3155675977c5d34ef748c0e28cbb81a36"} Apr 16 18:36:56.538579 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:56.538546 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" event={"ID":"331f686d-81a7-475d-8b25-fa2ec126dc59","Type":"ContainerStarted","Data":"b7970fc113677234d494f9f82e3cb63581eeb0b55964dac04cb2c9b3946490e7"} Apr 16 18:36:56.538994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:36:56.538621 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 
18:37:07.544169 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:07.544137 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" Apr 16 18:37:07.580296 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:07.580248 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7fd84c546d-lxlf4" podStartSLOduration=12.023334341 podStartE2EDuration="14.580234041s" podCreationTimestamp="2026-04-16 18:36:53 +0000 UTC" firstStartedPulling="2026-04-16 18:36:53.742673586 +0000 UTC m=+388.209228450" lastFinishedPulling="2026-04-16 18:36:56.299573286 +0000 UTC m=+390.766128150" observedRunningTime="2026-04-16 18:36:56.572658141 +0000 UTC m=+391.039213027" watchObservedRunningTime="2026-04-16 18:37:07.580234041 +0000 UTC m=+402.046788989" Apr 16 18:37:41.230094 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.230011 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6"] Apr 16 18:37:41.233314 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.233297 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.239475 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.239453 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 18:37:41.239599 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.239516 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6sw6v\"" Apr 16 18:37:41.240360 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.240341 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 18:37:41.240681 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.240662 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 18:37:41.240782 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.240686 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 18:37:41.257435 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.257383 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6"] Apr 16 18:37:41.311594 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.311560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.311724 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.311651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.311724 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.311713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbpz\" (UniqueName: \"kubernetes.io/projected/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-kube-api-access-4dbpz\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.412277 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.412235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.412277 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.412285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.412567 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.412343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbpz\" (UniqueName: \"kubernetes.io/projected/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-kube-api-access-4dbpz\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.412567 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:37:41.412416 2576 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 16 18:37:41.412567 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:37:41.412501 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-plugin-serving-cert podName:c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f nodeName:}" failed. No retries permitted until 2026-04-16 18:37:41.912478439 +0000 UTC m=+436.379033307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-b28b6" (UID: "c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f") : secret "plugin-serving-cert" not found Apr 16 18:37:41.413145 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.413121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.421034 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.421005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbpz\" (UniqueName: \"kubernetes.io/projected/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-kube-api-access-4dbpz\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.917844 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.917810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:41.920298 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:41.920267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-b28b6\" (UID: \"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:42.144819 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:42.144783 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" Apr 16 18:37:42.267043 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:42.267016 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6"] Apr 16 18:37:42.269120 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:37:42.269090 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc771b2d7_ddc6_4d54_bfd4_d50d3bfd300f.slice/crio-3373039a4d42d4b41c56c54f243e6eab327b5a41c69e9654cfe95034797caa42 WatchSource:0}: Error finding container 3373039a4d42d4b41c56c54f243e6eab327b5a41c69e9654cfe95034797caa42: Status 404 returned error can't find the container with id 3373039a4d42d4b41c56c54f243e6eab327b5a41c69e9654cfe95034797caa42 Apr 16 18:37:42.671181 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:42.671135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" 
event={"ID":"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f","Type":"ContainerStarted","Data":"3373039a4d42d4b41c56c54f243e6eab327b5a41c69e9654cfe95034797caa42"} Apr 16 18:37:47.688674 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:47.688638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" event={"ID":"c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f","Type":"ContainerStarted","Data":"aba705d2c466f60a8dfb47ab98ca42deddfe69f7e8f5ff397a23566acb84ef57"} Apr 16 18:37:47.706684 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:37:47.706641 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-b28b6" podStartSLOduration=2.269021471 podStartE2EDuration="6.706628872s" podCreationTimestamp="2026-04-16 18:37:41 +0000 UTC" firstStartedPulling="2026-04-16 18:37:42.270450834 +0000 UTC m=+436.737005698" lastFinishedPulling="2026-04-16 18:37:46.708058233 +0000 UTC m=+441.174613099" observedRunningTime="2026-04-16 18:37:47.704212479 +0000 UTC m=+442.170767364" watchObservedRunningTime="2026-04-16 18:37:47.706628872 +0000 UTC m=+442.173183757" Apr 16 18:40:24.297149 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.297108 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-r2nnm"] Apr 16 18:40:24.300736 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.300717 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:24.303340 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.303314 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 16 18:40:24.303513 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.303314 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:40:24.304668 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.304641 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:40:24.304668 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.304653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-ft86l\""
Apr 16 18:40:24.315862 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.315836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-r2nnm"]
Apr 16 18:40:24.416526 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.416481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfhvm\" (UniqueName: \"kubernetes.io/projected/346719e6-ab00-4d86-86d0-7327fe9168a6-kube-api-access-gfhvm\") pod \"model-serving-api-86f7b4b499-r2nnm\" (UID: \"346719e6-ab00-4d86-86d0-7327fe9168a6\") " pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:24.416732 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.416542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/346719e6-ab00-4d86-86d0-7327fe9168a6-tls-certs\") pod \"model-serving-api-86f7b4b499-r2nnm\" (UID: \"346719e6-ab00-4d86-86d0-7327fe9168a6\") " pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:24.517507 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.517472 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfhvm\" (UniqueName: \"kubernetes.io/projected/346719e6-ab00-4d86-86d0-7327fe9168a6-kube-api-access-gfhvm\") pod \"model-serving-api-86f7b4b499-r2nnm\" (UID: \"346719e6-ab00-4d86-86d0-7327fe9168a6\") " pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:24.517718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.517515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/346719e6-ab00-4d86-86d0-7327fe9168a6-tls-certs\") pod \"model-serving-api-86f7b4b499-r2nnm\" (UID: \"346719e6-ab00-4d86-86d0-7327fe9168a6\") " pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:24.517718 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:40:24.517647 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 16 18:40:24.517718 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:40:24.517711 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/346719e6-ab00-4d86-86d0-7327fe9168a6-tls-certs podName:346719e6-ab00-4d86-86d0-7327fe9168a6 nodeName:}" failed. No retries permitted until 2026-04-16 18:40:25.017694842 +0000 UTC m=+599.484249706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/346719e6-ab00-4d86-86d0-7327fe9168a6-tls-certs") pod "model-serving-api-86f7b4b499-r2nnm" (UID: "346719e6-ab00-4d86-86d0-7327fe9168a6") : secret "model-serving-api-tls" not found
Apr 16 18:40:24.528361 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:24.528330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfhvm\" (UniqueName: \"kubernetes.io/projected/346719e6-ab00-4d86-86d0-7327fe9168a6-kube-api-access-gfhvm\") pod \"model-serving-api-86f7b4b499-r2nnm\" (UID: \"346719e6-ab00-4d86-86d0-7327fe9168a6\") " pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:25.021378 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:25.021336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/346719e6-ab00-4d86-86d0-7327fe9168a6-tls-certs\") pod \"model-serving-api-86f7b4b499-r2nnm\" (UID: \"346719e6-ab00-4d86-86d0-7327fe9168a6\") " pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:25.023902 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:25.023883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/346719e6-ab00-4d86-86d0-7327fe9168a6-tls-certs\") pod \"model-serving-api-86f7b4b499-r2nnm\" (UID: \"346719e6-ab00-4d86-86d0-7327fe9168a6\") " pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:25.211662 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:25.211630 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:25.337659 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:25.337632 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-r2nnm"]
Apr 16 18:40:25.339640 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:40:25.339610 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod346719e6_ab00_4d86_86d0_7327fe9168a6.slice/crio-b9d1c2d7e04a5121a2cd5f5d00fc3e72eb8256d977dea3b1f18e45a6d6880b7c WatchSource:0}: Error finding container b9d1c2d7e04a5121a2cd5f5d00fc3e72eb8256d977dea3b1f18e45a6d6880b7c: Status 404 returned error can't find the container with id b9d1c2d7e04a5121a2cd5f5d00fc3e72eb8256d977dea3b1f18e45a6d6880b7c
Apr 16 18:40:26.083681 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:26.083653 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:40:26.083873 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:26.083744 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:40:26.150998 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:26.150963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-r2nnm" event={"ID":"346719e6-ab00-4d86-86d0-7327fe9168a6","Type":"ContainerStarted","Data":"b9d1c2d7e04a5121a2cd5f5d00fc3e72eb8256d977dea3b1f18e45a6d6880b7c"}
Apr 16 18:40:28.159299 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:28.159262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-r2nnm" event={"ID":"346719e6-ab00-4d86-86d0-7327fe9168a6","Type":"ContainerStarted","Data":"6d824e0a68b14ac00246db41fcc5b3a27510aa25fa025b5cb98dc61a300d7a7e"}
Apr 16 18:40:28.159695 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:28.159426 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:40:28.196636 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:28.196586 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-r2nnm" podStartSLOduration=2.076002101 podStartE2EDuration="4.196572341s" podCreationTimestamp="2026-04-16 18:40:24 +0000 UTC" firstStartedPulling="2026-04-16 18:40:25.341526702 +0000 UTC m=+599.808081565" lastFinishedPulling="2026-04-16 18:40:27.462096942 +0000 UTC m=+601.928651805" observedRunningTime="2026-04-16 18:40:28.194717925 +0000 UTC m=+602.661272815" watchObservedRunningTime="2026-04-16 18:40:28.196572341 +0000 UTC m=+602.663127227"
Apr 16 18:40:39.167172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:40:39.167095 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-r2nnm"
Apr 16 18:41:08.413636 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.413598 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"]
Apr 16 18:41:08.417003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.416967 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.419769 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.419742 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xbj66\""
Apr 16 18:41:08.420553 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.420530 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 16 18:41:08.420669 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.420605 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:41:08.420669 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.420633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:41:08.420776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.420612 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-x25bg\""
Apr 16 18:41:08.432308 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.432283 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"]
Apr 16 18:41:08.466581 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.466545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.466581 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.466586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.466842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.466613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.466842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.466757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8ghk\" (UniqueName: \"kubernetes.io/projected/c8bffa12-5603-4c0b-88ca-24aead643a33-kube-api-access-s8ghk\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.466842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.466805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8bffa12-5603-4c0b-88ca-24aead643a33-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.466984 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.466845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.567820 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.567775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8bffa12-5603-4c0b-88ca-24aead643a33-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.567820 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.567825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.568072 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.567855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.568072 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.567874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.568072 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.567900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.568072 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.567991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8ghk\" (UniqueName: \"kubernetes.io/projected/c8bffa12-5603-4c0b-88ca-24aead643a33-kube-api-access-s8ghk\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.568325 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.568304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.568385 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.568330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.568483 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.568465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.568529 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.568490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.570625 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.570601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8bffa12-5603-4c0b-88ca-24aead643a33-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.578254 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.578229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8ghk\" (UniqueName: \"kubernetes.io/projected/c8bffa12-5603-4c0b-88ca-24aead643a33-kube-api-access-s8ghk\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.727184 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.727087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:08.856334 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:08.856312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"]
Apr 16 18:41:08.858760 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:41:08.858731 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8bffa12_5603_4c0b_88ca_24aead643a33.slice/crio-9362a09fcb022e638f651a67f81cd017b92ea48e1f5d28aef9e5297f36a42915 WatchSource:0}: Error finding container 9362a09fcb022e638f651a67f81cd017b92ea48e1f5d28aef9e5297f36a42915: Status 404 returned error can't find the container with id 9362a09fcb022e638f651a67f81cd017b92ea48e1f5d28aef9e5297f36a42915
Apr 16 18:41:09.283799 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:09.283752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" event={"ID":"c8bffa12-5603-4c0b-88ca-24aead643a33","Type":"ContainerStarted","Data":"9362a09fcb022e638f651a67f81cd017b92ea48e1f5d28aef9e5297f36a42915"}
Apr 16 18:41:12.294780 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:12.294686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" event={"ID":"c8bffa12-5603-4c0b-88ca-24aead643a33","Type":"ContainerStarted","Data":"388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00"}
Apr 16 18:41:13.299236 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:13.299203 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerID="388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00" exitCode=0
Apr 16 18:41:13.299649 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:13.299299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" event={"ID":"c8bffa12-5603-4c0b-88ca-24aead643a33","Type":"ContainerDied","Data":"388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00"}
Apr 16 18:41:15.313799 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:15.313745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" event={"ID":"c8bffa12-5603-4c0b-88ca-24aead643a33","Type":"ContainerStarted","Data":"f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815"}
Apr 16 18:41:44.413101 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:44.413007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" event={"ID":"c8bffa12-5603-4c0b-88ca-24aead643a33","Type":"ContainerStarted","Data":"fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68"}
Apr 16 18:41:44.413547 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:44.413236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:44.415884 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:44.415863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:44.436127 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:44.436076 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" podStartSLOduration=1.226166702 podStartE2EDuration="36.436059728s" podCreationTimestamp="2026-04-16 18:41:08 +0000 UTC" firstStartedPulling="2026-04-16 18:41:08.860587039 +0000 UTC m=+643.327141904" lastFinishedPulling="2026-04-16 18:41:44.070480063 +0000 UTC m=+678.537034930" observedRunningTime="2026-04-16 18:41:44.434055841 +0000 UTC m=+678.900610727" watchObservedRunningTime="2026-04-16 18:41:44.436059728 +0000 UTC m=+678.902614615"
Apr 16 18:41:48.727555 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:48.727518 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:48.727555 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:48.727564 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:58.729192 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:58.729157 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:41:58.730500 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:41:58.730476 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:42:00.394523 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:00.394489 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"]
Apr 16 18:42:00.466485 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:00.466448 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="tokenizer" containerID="cri-o://fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68" gracePeriod=30
Apr 16 18:42:00.466676 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:00.466435 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="main" containerID="cri-o://f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815" gracePeriod=30
Apr 16 18:42:01.471297 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.471267 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerID="f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815" exitCode=0
Apr 16 18:42:01.471792 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.471318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" event={"ID":"c8bffa12-5603-4c0b-88ca-24aead643a33","Type":"ContainerDied","Data":"f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815"}
Apr 16 18:42:01.709958 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.709935 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:42:01.732441 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732345 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-tmp\") pod \"c8bffa12-5603-4c0b-88ca-24aead643a33\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") "
Apr 16 18:42:01.732441 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732389 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8ghk\" (UniqueName: \"kubernetes.io/projected/c8bffa12-5603-4c0b-88ca-24aead643a33-kube-api-access-s8ghk\") pod \"c8bffa12-5603-4c0b-88ca-24aead643a33\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") "
Apr 16 18:42:01.732637 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732446 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-kserve-provision-location\") pod \"c8bffa12-5603-4c0b-88ca-24aead643a33\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") "
Apr 16 18:42:01.732637 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732499 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-uds\") pod \"c8bffa12-5603-4c0b-88ca-24aead643a33\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") "
Apr 16 18:42:01.732637 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732521 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-cache\") pod \"c8bffa12-5603-4c0b-88ca-24aead643a33\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") "
Apr 16 18:42:01.732637 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732548 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8bffa12-5603-4c0b-88ca-24aead643a33-tls-certs\") pod \"c8bffa12-5603-4c0b-88ca-24aead643a33\" (UID: \"c8bffa12-5603-4c0b-88ca-24aead643a33\") "
Apr 16 18:42:01.732842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732780 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c8bffa12-5603-4c0b-88ca-24aead643a33" (UID: "c8bffa12-5603-4c0b-88ca-24aead643a33"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:42:01.732842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732822 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c8bffa12-5603-4c0b-88ca-24aead643a33" (UID: "c8bffa12-5603-4c0b-88ca-24aead643a33"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:42:01.732960 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.732864 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c8bffa12-5603-4c0b-88ca-24aead643a33" (UID: "c8bffa12-5603-4c0b-88ca-24aead643a33"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:42:01.733216 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.733195 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c8bffa12-5603-4c0b-88ca-24aead643a33" (UID: "c8bffa12-5603-4c0b-88ca-24aead643a33"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:42:01.735185 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.735155 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8bffa12-5603-4c0b-88ca-24aead643a33-kube-api-access-s8ghk" (OuterVolumeSpecName: "kube-api-access-s8ghk") pod "c8bffa12-5603-4c0b-88ca-24aead643a33" (UID: "c8bffa12-5603-4c0b-88ca-24aead643a33"). InnerVolumeSpecName "kube-api-access-s8ghk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:42:01.735339 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.735193 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8bffa12-5603-4c0b-88ca-24aead643a33-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c8bffa12-5603-4c0b-88ca-24aead643a33" (UID: "c8bffa12-5603-4c0b-88ca-24aead643a33"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:42:01.833921 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.833891 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:42:01.833921 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.833919 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s8ghk\" (UniqueName: \"kubernetes.io/projected/c8bffa12-5603-4c0b-88ca-24aead643a33-kube-api-access-s8ghk\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:42:01.834107 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.833929 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:42:01.834107 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.833939 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:42:01.834107 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.833948 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8bffa12-5603-4c0b-88ca-24aead643a33-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:42:01.834107 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:01.833956 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8bffa12-5603-4c0b-88ca-24aead643a33-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:42:02.476251 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.476149 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerID="fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68" exitCode=0
Apr 16 18:42:02.476772 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.476249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" event={"ID":"c8bffa12-5603-4c0b-88ca-24aead643a33","Type":"ContainerDied","Data":"fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68"}
Apr 16 18:42:02.476772 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.476278 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"
Apr 16 18:42:02.476772 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.476295 2576 scope.go:117] "RemoveContainer" containerID="fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68"
Apr 16 18:42:02.476772 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.476283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw" event={"ID":"c8bffa12-5603-4c0b-88ca-24aead643a33","Type":"ContainerDied","Data":"9362a09fcb022e638f651a67f81cd017b92ea48e1f5d28aef9e5297f36a42915"}
Apr 16 18:42:02.484420 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.484379 2576 scope.go:117] "RemoveContainer" containerID="f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815"
Apr 16 18:42:02.491952 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.491930 2576 scope.go:117] "RemoveContainer" containerID="388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00"
Apr 16 18:42:02.496038 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.496013 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"]
Apr 16 18:42:02.499165 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.499141 2576 scope.go:117] "RemoveContainer" containerID="fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68"
Apr 16 18:42:02.499455 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:42:02.499432 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68\": container with ID starting with fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68 not found: ID does not exist" containerID="fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68"
Apr 16 18:42:02.499575 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.499465 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68"} err="failed to get container status \"fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68\": rpc error: code = NotFound desc = could not find container \"fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68\": container with ID starting with fc4ab626254a5d4243cb155bcfee48086977bdb0f1cb25d857a2d58479dcfb68 not found: ID does not exist"
Apr 16 18:42:02.499575 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.499489 2576 scope.go:117] "RemoveContainer" containerID="f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815"
Apr 16 18:42:02.499778 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:42:02.499751 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815\": container with ID starting with f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815 not found: ID does not exist" containerID="f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815"
Apr 16 18:42:02.499884 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.499785 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815"} err="failed to get container status \"f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815\": rpc error: code = NotFound desc = could not find container \"f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815\": container with ID starting with f3907dbbfa07e7a0c71ce8ed71ba0bda331adab13e70efc52147a4e192ee3815 not found: ID does not exist"
Apr 16 18:42:02.499884 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.499808 2576 scope.go:117] "RemoveContainer" containerID="388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00"
Apr 16 18:42:02.500382 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:42:02.500359 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00\": container with ID starting with 388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00 not found: ID does not exist" containerID="388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00"
Apr 16 18:42:02.500497 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.500386 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00"} err="failed to get container status \"388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00\": rpc error: code = NotFound desc = could not find container \"388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00\": container with ID starting with 388f044bf566e81a88a836ab7de9462f36de9537b895ecaa747fce2ba4eedc00 not found: ID does not exist"
Apr 16 18:42:02.501650 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:02.501630 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-7d4849pfmw"]
Apr 16 18:42:04.179182 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:04.179139 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" path="/var/lib/kubelet/pods/c8bffa12-5603-4c0b-88ca-24aead643a33/volumes"
Apr 16 18:42:10.286144 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286062 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg"]
Apr 16 18:42:10.286544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286425 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="tokenizer"
Apr 16 18:42:10.286544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286439 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="tokenizer"
Apr 16 18:42:10.286544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286449 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="storage-initializer"
Apr 16 18:42:10.286544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286454 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="storage-initializer"
Apr 16 18:42:10.286544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286474 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="main"
Apr 16 18:42:10.286544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286479 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33"
containerName="main" Apr 16 18:42:10.286544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286536 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="main" Apr 16 18:42:10.286544 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.286547 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8bffa12-5603-4c0b-88ca-24aead643a33" containerName="tokenizer" Apr 16 18:42:10.532937 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.532900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg"] Apr 16 18:42:10.533119 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.533046 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.536716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.536633 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:42:10.536716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.536633 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 18:42:10.536946 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.536765 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xbj66\"" Apr 16 18:42:10.536946 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.536841 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 18:42:10.546643 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.546624 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj"] Apr 16 
18:42:10.568542 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.568520 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj"] Apr 16 18:42:10.568672 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.568637 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.571207 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.571188 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-5n2pf\"" Apr 16 18:42:10.610792 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.610758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.610918 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.610803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-dshm\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.610918 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.610837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn4kq\" (UniqueName: \"kubernetes.io/projected/c628b6de-fbba-4d3e-b47f-e3b271191168-kube-api-access-tn4kq\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.610918 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.610873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/328a58f6-6331-4215-9f0a-fce75780582a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.610918 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.610900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.611048 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.610921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c628b6de-fbba-4d3e-b47f-e3b271191168-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.611048 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.610947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlmj\" (UniqueName: \"kubernetes.io/projected/328a58f6-6331-4215-9f0a-fce75780582a-kube-api-access-xjlmj\") pod 
\"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.611048 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.610969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-model-cache\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.611048 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.611011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.611165 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.611055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.611165 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.611085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-home\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" 
(UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.611165 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.611102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.711776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.711741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.711776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.711778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-home\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712008 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.711800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.712008 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.711836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.712008 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.711874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-dshm\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712008 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.711897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tn4kq\" (UniqueName: \"kubernetes.io/projected/c628b6de-fbba-4d3e-b47f-e3b271191168-kube-api-access-tn4kq\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.712008 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.711933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/328a58f6-6331-4215-9f0a-fce75780582a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712248 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:42:10.712207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.712302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.712364 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712432 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-home\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712432 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/c628b6de-fbba-4d3e-b47f-e3b271191168-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.712432 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.712581 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlmj\" (UniqueName: \"kubernetes.io/projected/328a58f6-6331-4215-9f0a-fce75780582a-kube-api-access-xjlmj\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712581 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-model-cache\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712581 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.712746 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712823 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-model-cache\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.712868 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.712808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.714963 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.714937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c628b6de-fbba-4d3e-b47f-e3b271191168-tls-certs\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.714963 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.714939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-dshm\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.715115 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.715078 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/328a58f6-6331-4215-9f0a-fce75780582a-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.720666 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.720642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlmj\" (UniqueName: \"kubernetes.io/projected/328a58f6-6331-4215-9f0a-fce75780582a-kube-api-access-xjlmj\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.720789 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.720720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn4kq\" (UniqueName: \"kubernetes.io/projected/c628b6de-fbba-4d3e-b47f-e3b271191168-kube-api-access-tn4kq\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.844574 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.844536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:10.878568 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.878532 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:10.980888 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.979237 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg"] Apr 16 18:42:10.986896 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:10.986866 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:42:11.020668 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:11.020637 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj"] Apr 16 18:42:11.023578 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:42:11.023552 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc628b6de_fbba_4d3e_b47f_e3b271191168.slice/crio-0dde2569f9341958da850bfcd31943443a4948ec982b27e0009794bdd9eaa24a WatchSource:0}: Error finding container 0dde2569f9341958da850bfcd31943443a4948ec982b27e0009794bdd9eaa24a: Status 404 returned error can't find the container with id 0dde2569f9341958da850bfcd31943443a4948ec982b27e0009794bdd9eaa24a Apr 16 18:42:11.502698 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:11.502608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" 
event={"ID":"c628b6de-fbba-4d3e-b47f-e3b271191168","Type":"ContainerStarted","Data":"81f478570f1322291197ecb7dcedfb8b47fa2683dea34bec081e0dd1bfe6d2af"} Apr 16 18:42:11.502698 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:11.502652 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" event={"ID":"c628b6de-fbba-4d3e-b47f-e3b271191168","Type":"ContainerStarted","Data":"0dde2569f9341958da850bfcd31943443a4948ec982b27e0009794bdd9eaa24a"} Apr 16 18:42:11.504257 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:11.504230 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" event={"ID":"328a58f6-6331-4215-9f0a-fce75780582a","Type":"ContainerStarted","Data":"c3c1487c8e1128cb83c9b67271c2a9bb65bf4b48746db2581b9c11cfe53653ab"} Apr 16 18:42:11.504334 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:11.504267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" event={"ID":"328a58f6-6331-4215-9f0a-fce75780582a","Type":"ContainerStarted","Data":"0b96e53e669eb5a3ac975ac60e90544ab3b0b7fa86cdeb135ccceaf9815e3622"} Apr 16 18:42:12.512281 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:12.512202 2576 generic.go:358] "Generic (PLEG): container finished" podID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerID="81f478570f1322291197ecb7dcedfb8b47fa2683dea34bec081e0dd1bfe6d2af" exitCode=0 Apr 16 18:42:12.512795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:12.512298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" event={"ID":"c628b6de-fbba-4d3e-b47f-e3b271191168","Type":"ContainerDied","Data":"81f478570f1322291197ecb7dcedfb8b47fa2683dea34bec081e0dd1bfe6d2af"} Apr 16 18:42:13.519409 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:13.519350 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" event={"ID":"c628b6de-fbba-4d3e-b47f-e3b271191168","Type":"ContainerStarted","Data":"b941a023b4b38e6cd120e31bb77363b941104e2478e33c405431328f0e1a49f1"} Apr 16 18:42:13.519409 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:13.519413 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" event={"ID":"c628b6de-fbba-4d3e-b47f-e3b271191168","Type":"ContainerStarted","Data":"6947f05b3adc5929ade5c5c3e72f9c802502a1c128552227d298b992c06a44c8"} Apr 16 18:42:13.519843 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:13.519755 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:13.542848 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:13.542795 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" podStartSLOduration=3.542778137 podStartE2EDuration="3.542778137s" podCreationTimestamp="2026-04-16 18:42:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:42:13.541418316 +0000 UTC m=+708.007973202" watchObservedRunningTime="2026-04-16 18:42:13.542778137 +0000 UTC m=+708.009333057" Apr 16 18:42:15.528628 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:15.528542 2576 generic.go:358] "Generic (PLEG): container finished" podID="328a58f6-6331-4215-9f0a-fce75780582a" containerID="c3c1487c8e1128cb83c9b67271c2a9bb65bf4b48746db2581b9c11cfe53653ab" exitCode=0 Apr 16 18:42:15.528628 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:15.528589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" event={"ID":"328a58f6-6331-4215-9f0a-fce75780582a","Type":"ContainerDied","Data":"c3c1487c8e1128cb83c9b67271c2a9bb65bf4b48746db2581b9c11cfe53653ab"} Apr 16 18:42:17.536605 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:17.536568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" event={"ID":"328a58f6-6331-4215-9f0a-fce75780582a","Type":"ContainerStarted","Data":"743c7dcfdade4ff692c94c0dc7824335438f511e7d31c30bbce210a33bb0c750"} Apr 16 18:42:17.560761 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:17.560701 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" podStartSLOduration=6.105056257 podStartE2EDuration="7.560686762s" podCreationTimestamp="2026-04-16 18:42:10 +0000 UTC" firstStartedPulling="2026-04-16 18:42:15.529610329 +0000 UTC m=+709.996165192" lastFinishedPulling="2026-04-16 18:42:16.985240824 +0000 UTC m=+711.451795697" observedRunningTime="2026-04-16 18:42:17.558754902 +0000 UTC m=+712.025309789" watchObservedRunningTime="2026-04-16 18:42:17.560686762 +0000 UTC m=+712.027241648" Apr 16 18:42:20.845686 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.845650 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:20.845686 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.845692 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:20.858721 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.858698 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:20.878787 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:42:20.878756 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:20.878936 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.878796 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:20.882064 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.882034 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:20.897278 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.897244 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq"] Apr 16 18:42:20.935338 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.935300 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq"] Apr 16 18:42:20.935520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.935446 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:20.938376 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.938358 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 18:42:20.938476 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:20.938446 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-24dfl\"" Apr 16 18:42:21.002773 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.002730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgwkj\" (UniqueName: \"kubernetes.io/projected/d4cb74ed-fb93-49a8-8b42-687dab32388a-kube-api-access-xgwkj\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.002948 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.002776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.002948 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.002884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-uds\") pod 
\"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.002948 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.002933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.003135 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.002994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4cb74ed-fb93-49a8-8b42-687dab32388a-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.003135 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.003015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.103900 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.103809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgwkj\" (UniqueName: \"kubernetes.io/projected/d4cb74ed-fb93-49a8-8b42-687dab32388a-kube-api-access-xgwkj\") 
pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.103900 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.103849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.103900 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.103883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.104171 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.104006 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.104171 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.104097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4cb74ed-fb93-49a8-8b42-687dab32388a-tls-certs\") pod 
\"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.104171 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.104128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.104290 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.104204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.104353 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.104335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.104387 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.104335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-kserve-provision-location\") pod 
\"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.104456 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.104437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.106703 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.106683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4cb74ed-fb93-49a8-8b42-687dab32388a-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.112239 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.112220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgwkj\" (UniqueName: \"kubernetes.io/projected/d4cb74ed-fb93-49a8-8b42-687dab32388a-kube-api-access-xgwkj\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.245345 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.245309 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:21.380646 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.380570 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq"] Apr 16 18:42:21.384546 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:42:21.384513 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4cb74ed_fb93_49a8_8b42_687dab32388a.slice/crio-b4fc7915334c6ef9e8192bd6171c5da5bf569d9557c7e750434f5e756b821ea5 WatchSource:0}: Error finding container b4fc7915334c6ef9e8192bd6171c5da5bf569d9557c7e750434f5e756b821ea5: Status 404 returned error can't find the container with id b4fc7915334c6ef9e8192bd6171c5da5bf569d9557c7e750434f5e756b821ea5 Apr 16 18:42:21.550781 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.550736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" event={"ID":"d4cb74ed-fb93-49a8-8b42-687dab32388a","Type":"ContainerStarted","Data":"adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89"} Apr 16 18:42:21.550781 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.550779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" event={"ID":"d4cb74ed-fb93-49a8-8b42-687dab32388a","Type":"ContainerStarted","Data":"b4fc7915334c6ef9e8192bd6171c5da5bf569d9557c7e750434f5e756b821ea5"} Apr 16 18:42:21.552094 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.552068 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:21.563828 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:21.563805 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:42:22.556033 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:22.555998 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerID="adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89" exitCode=0 Apr 16 18:42:22.556500 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:22.556083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" event={"ID":"d4cb74ed-fb93-49a8-8b42-687dab32388a","Type":"ContainerDied","Data":"adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89"} Apr 16 18:42:23.562023 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:23.561990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" event={"ID":"d4cb74ed-fb93-49a8-8b42-687dab32388a","Type":"ContainerStarted","Data":"10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2"} Apr 16 18:42:23.562023 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:23.562022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" event={"ID":"d4cb74ed-fb93-49a8-8b42-687dab32388a","Type":"ContainerStarted","Data":"b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea"} Apr 16 18:42:23.562457 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:23.562143 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:23.587136 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:23.587079 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" podStartSLOduration=3.587064144 podStartE2EDuration="3.587064144s" podCreationTimestamp="2026-04-16 18:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:42:23.585194546 +0000 UTC m=+718.051749431" watchObservedRunningTime="2026-04-16 18:42:23.587064144 +0000 UTC m=+718.053619028" Apr 16 18:42:31.245923 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:31.245891 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:31.245923 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:31.245930 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:31.248503 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:31.248480 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:31.589244 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:31.589203 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:42:42.558254 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:42.558226 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:42:52.594354 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:42:52.594324 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:43:18.610381 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:43:18.610346 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg"] Apr 16 18:43:18.610925 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.610768 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" podUID="328a58f6-6331-4215-9f0a-fce75780582a" containerName="main" containerID="cri-o://743c7dcfdade4ff692c94c0dc7824335438f511e7d31c30bbce210a33bb0c750" gracePeriod=30 Apr 16 18:43:18.613791 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.613768 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj"] Apr 16 18:43:18.614177 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.614124 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="main" containerID="cri-o://6947f05b3adc5929ade5c5c3e72f9c802502a1c128552227d298b992c06a44c8" gracePeriod=30 Apr 16 18:43:18.614326 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.614302 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="tokenizer" containerID="cri-o://b941a023b4b38e6cd120e31bb77363b941104e2478e33c405431328f0e1a49f1" gracePeriod=30 Apr 16 18:43:18.751596 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.751562 2576 generic.go:358] "Generic (PLEG): container finished" podID="328a58f6-6331-4215-9f0a-fce75780582a" containerID="743c7dcfdade4ff692c94c0dc7824335438f511e7d31c30bbce210a33bb0c750" exitCode=0 Apr 16 18:43:18.751771 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.751640 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" event={"ID":"328a58f6-6331-4215-9f0a-fce75780582a","Type":"ContainerDied","Data":"743c7dcfdade4ff692c94c0dc7824335438f511e7d31c30bbce210a33bb0c750"} Apr 16 18:43:18.753571 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.753550 2576 generic.go:358] "Generic (PLEG): container finished" podID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerID="6947f05b3adc5929ade5c5c3e72f9c802502a1c128552227d298b992c06a44c8" exitCode=0 Apr 16 18:43:18.753688 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.753593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" event={"ID":"c628b6de-fbba-4d3e-b47f-e3b271191168","Type":"ContainerDied","Data":"6947f05b3adc5929ade5c5c3e72f9c802502a1c128552227d298b992c06a44c8"} Apr 16 18:43:18.861081 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:18.861015 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:43:19.007706 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.007667 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-home\") pod \"328a58f6-6331-4215-9f0a-fce75780582a\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " Apr 16 18:43:19.007706 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.007711 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-kserve-provision-location\") pod \"328a58f6-6331-4215-9f0a-fce75780582a\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " Apr 16 18:43:19.007947 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.007739 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjlmj\" (UniqueName: \"kubernetes.io/projected/328a58f6-6331-4215-9f0a-fce75780582a-kube-api-access-xjlmj\") pod \"328a58f6-6331-4215-9f0a-fce75780582a\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " Apr 16 18:43:19.007947 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.007765 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/328a58f6-6331-4215-9f0a-fce75780582a-tls-certs\") pod \"328a58f6-6331-4215-9f0a-fce75780582a\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " Apr 16 18:43:19.007947 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.007836 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-dshm\") pod \"328a58f6-6331-4215-9f0a-fce75780582a\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " Apr 16 18:43:19.007947 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:43:19.007849 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-model-cache\") pod \"328a58f6-6331-4215-9f0a-fce75780582a\" (UID: \"328a58f6-6331-4215-9f0a-fce75780582a\") " Apr 16 18:43:19.008172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.008023 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-home" (OuterVolumeSpecName: "home") pod "328a58f6-6331-4215-9f0a-fce75780582a" (UID: "328a58f6-6331-4215-9f0a-fce75780582a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:19.008677 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.008650 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-home\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.016189 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.008993 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-model-cache" (OuterVolumeSpecName: "model-cache") pod "328a58f6-6331-4215-9f0a-fce75780582a" (UID: "328a58f6-6331-4215-9f0a-fce75780582a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:19.016189 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.011585 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/328a58f6-6331-4215-9f0a-fce75780582a-kube-api-access-xjlmj" (OuterVolumeSpecName: "kube-api-access-xjlmj") pod "328a58f6-6331-4215-9f0a-fce75780582a" (UID: "328a58f6-6331-4215-9f0a-fce75780582a"). InnerVolumeSpecName "kube-api-access-xjlmj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:43:19.016189 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.012204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-dshm" (OuterVolumeSpecName: "dshm") pod "328a58f6-6331-4215-9f0a-fce75780582a" (UID: "328a58f6-6331-4215-9f0a-fce75780582a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:19.016572 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.016548 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/328a58f6-6331-4215-9f0a-fce75780582a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "328a58f6-6331-4215-9f0a-fce75780582a" (UID: "328a58f6-6331-4215-9f0a-fce75780582a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:43:19.069850 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.069788 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "328a58f6-6331-4215-9f0a-fce75780582a" (UID: "328a58f6-6331-4215-9f0a-fce75780582a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:19.109425 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.109378 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xjlmj\" (UniqueName: \"kubernetes.io/projected/328a58f6-6331-4215-9f0a-fce75780582a-kube-api-access-xjlmj\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.109425 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.109422 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/328a58f6-6331-4215-9f0a-fce75780582a-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.109635 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.109434 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-dshm\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.109635 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.109446 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-model-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.109635 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.109459 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/328a58f6-6331-4215-9f0a-fce75780582a-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:19.759966 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.759934 2576 generic.go:358] "Generic (PLEG): container finished" podID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerID="b941a023b4b38e6cd120e31bb77363b941104e2478e33c405431328f0e1a49f1" exitCode=0 Apr 16 18:43:19.760361 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.759989 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" event={"ID":"c628b6de-fbba-4d3e-b47f-e3b271191168","Type":"ContainerDied","Data":"b941a023b4b38e6cd120e31bb77363b941104e2478e33c405431328f0e1a49f1"} Apr 16 18:43:19.761662 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.761636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" event={"ID":"328a58f6-6331-4215-9f0a-fce75780582a","Type":"ContainerDied","Data":"0b96e53e669eb5a3ac975ac60e90544ab3b0b7fa86cdeb135ccceaf9815e3622"} Apr 16 18:43:19.761767 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.761677 2576 scope.go:117] "RemoveContainer" containerID="743c7dcfdade4ff692c94c0dc7824335438f511e7d31c30bbce210a33bb0c750" Apr 16 18:43:19.761767 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.761700 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg" Apr 16 18:43:19.770643 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.770622 2576 scope.go:117] "RemoveContainer" containerID="c3c1487c8e1128cb83c9b67271c2a9bb65bf4b48746db2581b9c11cfe53653ab" Apr 16 18:43:19.788010 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.787985 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg"] Apr 16 18:43:19.794887 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.794863 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-gl4wg"] Apr 16 18:43:19.876017 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:19.875988 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:43:20.017241 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017145 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-tmp\") pod \"c628b6de-fbba-4d3e-b47f-e3b271191168\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " Apr 16 18:43:20.017241 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017240 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-kserve-provision-location\") pod \"c628b6de-fbba-4d3e-b47f-e3b271191168\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " Apr 16 18:43:20.017499 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017266 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-uds\") pod \"c628b6de-fbba-4d3e-b47f-e3b271191168\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " Apr 16 18:43:20.017499 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017285 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c628b6de-fbba-4d3e-b47f-e3b271191168-tls-certs\") pod \"c628b6de-fbba-4d3e-b47f-e3b271191168\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " Apr 16 18:43:20.017499 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017301 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-cache\") pod \"c628b6de-fbba-4d3e-b47f-e3b271191168\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " Apr 16 
18:43:20.017499 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017323 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn4kq\" (UniqueName: \"kubernetes.io/projected/c628b6de-fbba-4d3e-b47f-e3b271191168-kube-api-access-tn4kq\") pod \"c628b6de-fbba-4d3e-b47f-e3b271191168\" (UID: \"c628b6de-fbba-4d3e-b47f-e3b271191168\") " Apr 16 18:43:20.017755 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017588 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c628b6de-fbba-4d3e-b47f-e3b271191168" (UID: "c628b6de-fbba-4d3e-b47f-e3b271191168"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:20.017755 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017589 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c628b6de-fbba-4d3e-b47f-e3b271191168" (UID: "c628b6de-fbba-4d3e-b47f-e3b271191168"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:20.017755 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.017623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c628b6de-fbba-4d3e-b47f-e3b271191168" (UID: "c628b6de-fbba-4d3e-b47f-e3b271191168"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:20.018054 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.018030 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c628b6de-fbba-4d3e-b47f-e3b271191168" (UID: "c628b6de-fbba-4d3e-b47f-e3b271191168"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:20.019623 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.019601 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c628b6de-fbba-4d3e-b47f-e3b271191168-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c628b6de-fbba-4d3e-b47f-e3b271191168" (UID: "c628b6de-fbba-4d3e-b47f-e3b271191168"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:43:20.019695 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.019656 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c628b6de-fbba-4d3e-b47f-e3b271191168-kube-api-access-tn4kq" (OuterVolumeSpecName: "kube-api-access-tn4kq") pod "c628b6de-fbba-4d3e-b47f-e3b271191168" (UID: "c628b6de-fbba-4d3e-b47f-e3b271191168"). InnerVolumeSpecName "kube-api-access-tn4kq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:43:20.118967 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.118914 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.118967 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.118962 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.118967 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.118972 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.118967 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.118981 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c628b6de-fbba-4d3e-b47f-e3b271191168-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.119244 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.118990 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c628b6de-fbba-4d3e-b47f-e3b271191168-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.119244 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.118999 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tn4kq\" (UniqueName: \"kubernetes.io/projected/c628b6de-fbba-4d3e-b47f-e3b271191168-kube-api-access-tn4kq\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:43:20.180358 ip-10-0-137-47 kubenswrapper[2576]: 
I0416 18:43:20.180324 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="328a58f6-6331-4215-9f0a-fce75780582a" path="/var/lib/kubelet/pods/328a58f6-6331-4215-9f0a-fce75780582a/volumes" Apr 16 18:43:20.767610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.767577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" event={"ID":"c628b6de-fbba-4d3e-b47f-e3b271191168","Type":"ContainerDied","Data":"0dde2569f9341958da850bfcd31943443a4948ec982b27e0009794bdd9eaa24a"} Apr 16 18:43:20.767610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.767604 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj" Apr 16 18:43:20.768132 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.767621 2576 scope.go:117] "RemoveContainer" containerID="b941a023b4b38e6cd120e31bb77363b941104e2478e33c405431328f0e1a49f1" Apr 16 18:43:20.775518 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.775496 2576 scope.go:117] "RemoveContainer" containerID="6947f05b3adc5929ade5c5c3e72f9c802502a1c128552227d298b992c06a44c8" Apr 16 18:43:20.782540 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.782525 2576 scope.go:117] "RemoveContainer" containerID="81f478570f1322291197ecb7dcedfb8b47fa2683dea34bec081e0dd1bfe6d2af" Apr 16 18:43:20.786914 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.786893 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj"] Apr 16 18:43:20.789586 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:20.789563 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf98lljj"] Apr 16 18:43:22.179372 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:22.179339 2576 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" path="/var/lib/kubelet/pods/c628b6de-fbba-4d3e-b47f-e3b271191168/volumes" Apr 16 18:43:26.482140 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482066 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh"] Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482365 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="328a58f6-6331-4215-9f0a-fce75780582a" containerName="main" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482375 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="328a58f6-6331-4215-9f0a-fce75780582a" containerName="main" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482386 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="328a58f6-6331-4215-9f0a-fce75780582a" containerName="storage-initializer" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482409 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="328a58f6-6331-4215-9f0a-fce75780582a" containerName="storage-initializer" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482434 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="storage-initializer" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482441 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="storage-initializer" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482450 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="main" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482456 
2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="main" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482462 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="tokenizer" Apr 16 18:43:26.482505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482467 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="tokenizer" Apr 16 18:43:26.482791 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482519 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="328a58f6-6331-4215-9f0a-fce75780582a" containerName="main" Apr 16 18:43:26.482791 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482529 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="tokenizer" Apr 16 18:43:26.482791 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.482535 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c628b6de-fbba-4d3e-b47f-e3b271191168" containerName="main" Apr 16 18:43:26.487349 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.487331 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.490385 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.490365 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 18:43:26.495933 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.495914 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh"] Apr 16 18:43:26.578720 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.578667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.578720 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.578727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.578959 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.578792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/433cc5f7-67e2-43d6-8c41-fcd9eca85961-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.578959 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:43:26.578877 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.578959 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.578912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7sgb\" (UniqueName: \"kubernetes.io/projected/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kube-api-access-g7sgb\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.578959 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.578947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.679994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.679939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.679994 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.679999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.680251 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.680036 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/433cc5f7-67e2-43d6-8c41-fcd9eca85961-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.680251 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.680084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.680251 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.680206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7sgb\" (UniqueName: \"kubernetes.io/projected/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kube-api-access-g7sgb\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.680445 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.680258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: 
\"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.680445 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.680323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-home\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.680445 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.680370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-model-cache\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.680589 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.680569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.682591 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.682571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-dshm\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.682871 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.682852 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/433cc5f7-67e2-43d6-8c41-fcd9eca85961-tls-certs\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.689471 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.689451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7sgb\" (UniqueName: \"kubernetes.io/projected/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kube-api-access-g7sgb\") pod \"precise-prefix-cache-test-kserve-d67dfdb78-njcvh\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.726160 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.726133 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9"] Apr 16 18:43:26.729754 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.729737 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.732427 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.732359 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-f4nfs\"" Apr 16 18:43:26.742863 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.742841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9"] Apr 16 18:43:26.798567 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.798533 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:26.882318 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.882280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.882481 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.882327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.882481 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.882361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f381abda-db1a-4ae9-8426-eb44f7a9957a-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.882481 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.882378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57r9r\" (UniqueName: \"kubernetes.io/projected/f381abda-db1a-4ae9-8426-eb44f7a9957a-kube-api-access-57r9r\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: 
\"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.882638 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.882493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.882638 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.882538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.946121 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.946095 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh"] Apr 16 18:43:26.947820 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:43:26.947786 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod433cc5f7_67e2_43d6_8c41_fcd9eca85961.slice/crio-d1e925745d145755495d8cb485de010c7a7414ab8cfd95b3bc5b7c82de4cfa17 WatchSource:0}: Error finding container d1e925745d145755495d8cb485de010c7a7414ab8cfd95b3bc5b7c82de4cfa17: Status 404 returned error can't find the container with id d1e925745d145755495d8cb485de010c7a7414ab8cfd95b3bc5b7c82de4cfa17 Apr 16 18:43:26.983000 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:43:26.982936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983131 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983131 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f381abda-db1a-4ae9-8426-eb44f7a9957a-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983131 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57r9r\" (UniqueName: \"kubernetes.io/projected/f381abda-db1a-4ae9-8426-eb44f7a9957a-kube-api-access-57r9r\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983283 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983141 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983283 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983411 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983474 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983612 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.983702 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.983649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.985804 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.985780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f381abda-db1a-4ae9-8426-eb44f7a9957a-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:26.997854 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:26.997834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57r9r\" (UniqueName: \"kubernetes.io/projected/f381abda-db1a-4ae9-8426-eb44f7a9957a-kube-api-access-57r9r\") pod \"precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:27.039578 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:27.039549 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:27.167759 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:27.167724 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9"] Apr 16 18:43:27.170756 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:43:27.170723 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf381abda_db1a_4ae9_8426_eb44f7a9957a.slice/crio-668ace76abf48bb69418d03da8d176985c0159cc472e1be421021cc26e49eafa WatchSource:0}: Error finding container 668ace76abf48bb69418d03da8d176985c0159cc472e1be421021cc26e49eafa: Status 404 returned error can't find the container with id 668ace76abf48bb69418d03da8d176985c0159cc472e1be421021cc26e49eafa Apr 16 18:43:27.791470 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:27.791430 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" event={"ID":"f381abda-db1a-4ae9-8426-eb44f7a9957a","Type":"ContainerStarted","Data":"c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763"} Apr 16 18:43:27.791896 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:27.791477 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" event={"ID":"f381abda-db1a-4ae9-8426-eb44f7a9957a","Type":"ContainerStarted","Data":"668ace76abf48bb69418d03da8d176985c0159cc472e1be421021cc26e49eafa"} Apr 16 18:43:27.798593 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:27.797168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" 
event={"ID":"433cc5f7-67e2-43d6-8c41-fcd9eca85961","Type":"ContainerStarted","Data":"cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08"} Apr 16 18:43:27.798593 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:27.797209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" event={"ID":"433cc5f7-67e2-43d6-8c41-fcd9eca85961","Type":"ContainerStarted","Data":"d1e925745d145755495d8cb485de010c7a7414ab8cfd95b3bc5b7c82de4cfa17"} Apr 16 18:43:28.801318 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:28.801283 2576 generic.go:358] "Generic (PLEG): container finished" podID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerID="c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763" exitCode=0 Apr 16 18:43:28.801743 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:28.801373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" event={"ID":"f381abda-db1a-4ae9-8426-eb44f7a9957a","Type":"ContainerDied","Data":"c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763"} Apr 16 18:43:29.811684 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:29.811630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" event={"ID":"f381abda-db1a-4ae9-8426-eb44f7a9957a","Type":"ContainerStarted","Data":"33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621"} Apr 16 18:43:29.811684 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:29.811678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" event={"ID":"f381abda-db1a-4ae9-8426-eb44f7a9957a","Type":"ContainerStarted","Data":"8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106"} Apr 16 18:43:29.812183 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:29.811794 
2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:29.836868 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:29.836815 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" podStartSLOduration=3.8367999839999998 podStartE2EDuration="3.836799984s" podCreationTimestamp="2026-04-16 18:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:43:29.834737184 +0000 UTC m=+784.301292084" watchObservedRunningTime="2026-04-16 18:43:29.836799984 +0000 UTC m=+784.303354862" Apr 16 18:43:31.821039 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:31.821005 2576 generic.go:358] "Generic (PLEG): container finished" podID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" containerID="cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08" exitCode=0 Apr 16 18:43:31.821505 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:31.821081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" event={"ID":"433cc5f7-67e2-43d6-8c41-fcd9eca85961","Type":"ContainerDied","Data":"cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08"} Apr 16 18:43:32.826731 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:32.826699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" event={"ID":"433cc5f7-67e2-43d6-8c41-fcd9eca85961","Type":"ContainerStarted","Data":"9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb"} Apr 16 18:43:32.865650 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:32.865569 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" podStartSLOduration=6.865547073 podStartE2EDuration="6.865547073s" podCreationTimestamp="2026-04-16 18:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:43:32.862815998 +0000 UTC m=+787.329370885" watchObservedRunningTime="2026-04-16 18:43:32.865547073 +0000 UTC m=+787.332101961" Apr 16 18:43:36.799644 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:36.799601 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:36.799644 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:36.799653 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:36.812275 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:36.812254 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:36.852372 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:36.852343 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:43:37.040343 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:37.040305 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:37.040343 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:37.040356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:37.041617 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:43:37.041593 2576 logging.go:55] [core] [Channel #95 
SubChannel #96]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.34:9003", ServerName: "10.132.0.34:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.34:9003: connect: connection refused" Apr 16 18:43:37.042855 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:37.042832 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:37.846069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:37.846041 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:38.041176 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:38.041129 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.34:9003\" within 1s: context deadline exceeded" Apr 16 18:43:47.040930 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:43:47.040891 2576 logging.go:55] [core] [Channel #103 SubChannel #104]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.34:9003", ServerName: "10.132.0.34:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.34:9003: connect: connection refused" Apr 16 18:43:48.041462 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:48.041415 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.34:9003\" within 1s: context deadline exceeded" Apr 16 18:43:58.849299 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:58.849265 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:43:59.940158 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:59.940129 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh"] Apr 16 18:43:59.940645 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:59.940473 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" podUID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" containerName="main" containerID="cri-o://9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb" gracePeriod=30 Apr 16 18:43:59.942051 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:59.942018 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9"] Apr 16 18:43:59.942444 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:59.942376 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="main" 
containerID="cri-o://8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106" gracePeriod=30 Apr 16 18:43:59.942648 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:43:59.942449 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="tokenizer" containerID="cri-o://33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621" gracePeriod=30 Apr 16 18:44:00.192265 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.192199 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:44:00.273509 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.273474 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-dshm\") pod \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " Apr 16 18:44:00.273509 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.273516 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kserve-provision-location\") pod \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " Apr 16 18:44:00.273763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.273563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-model-cache\") pod \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " Apr 16 18:44:00.273763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.273605 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-g7sgb\" (UniqueName: \"kubernetes.io/projected/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kube-api-access-g7sgb\") pod \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " Apr 16 18:44:00.273763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.273672 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/433cc5f7-67e2-43d6-8c41-fcd9eca85961-tls-certs\") pod \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " Apr 16 18:44:00.273763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.273700 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-home\") pod \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\" (UID: \"433cc5f7-67e2-43d6-8c41-fcd9eca85961\") " Apr 16 18:44:00.273973 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.273816 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-model-cache" (OuterVolumeSpecName: "model-cache") pod "433cc5f7-67e2-43d6-8c41-fcd9eca85961" (UID: "433cc5f7-67e2-43d6-8c41-fcd9eca85961"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:00.273973 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.273963 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-home" (OuterVolumeSpecName: "home") pod "433cc5f7-67e2-43d6-8c41-fcd9eca85961" (UID: "433cc5f7-67e2-43d6-8c41-fcd9eca85961"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:00.274076 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.274043 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-model-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.274076 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.274059 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-home\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.276327 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.276289 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kube-api-access-g7sgb" (OuterVolumeSpecName: "kube-api-access-g7sgb") pod "433cc5f7-67e2-43d6-8c41-fcd9eca85961" (UID: "433cc5f7-67e2-43d6-8c41-fcd9eca85961"). InnerVolumeSpecName "kube-api-access-g7sgb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:44:00.276489 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.276361 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/433cc5f7-67e2-43d6-8c41-fcd9eca85961-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "433cc5f7-67e2-43d6-8c41-fcd9eca85961" (UID: "433cc5f7-67e2-43d6-8c41-fcd9eca85961"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:44:00.276489 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.276441 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-dshm" (OuterVolumeSpecName: "dshm") pod "433cc5f7-67e2-43d6-8c41-fcd9eca85961" (UID: "433cc5f7-67e2-43d6-8c41-fcd9eca85961"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:00.329735 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.329696 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "433cc5f7-67e2-43d6-8c41-fcd9eca85961" (UID: "433cc5f7-67e2-43d6-8c41-fcd9eca85961"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:00.375006 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.374972 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-dshm\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.375006 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.375004 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.375192 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.375017 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g7sgb\" (UniqueName: \"kubernetes.io/projected/433cc5f7-67e2-43d6-8c41-fcd9eca85961-kube-api-access-g7sgb\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.375192 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.375031 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/433cc5f7-67e2-43d6-8c41-fcd9eca85961-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:00.924858 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.924822 2576 generic.go:358] "Generic (PLEG): container finished" podID="f381abda-db1a-4ae9-8426-eb44f7a9957a" 
containerID="8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106" exitCode=0 Apr 16 18:44:00.925048 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.924899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" event={"ID":"f381abda-db1a-4ae9-8426-eb44f7a9957a","Type":"ContainerDied","Data":"8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106"} Apr 16 18:44:00.926360 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.926336 2576 generic.go:358] "Generic (PLEG): container finished" podID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" containerID="9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb" exitCode=0 Apr 16 18:44:00.926475 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.926431 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" Apr 16 18:44:00.926520 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.926425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" event={"ID":"433cc5f7-67e2-43d6-8c41-fcd9eca85961","Type":"ContainerDied","Data":"9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb"} Apr 16 18:44:00.926558 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.926542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh" event={"ID":"433cc5f7-67e2-43d6-8c41-fcd9eca85961","Type":"ContainerDied","Data":"d1e925745d145755495d8cb485de010c7a7414ab8cfd95b3bc5b7c82de4cfa17"} Apr 16 18:44:00.926591 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.926566 2576 scope.go:117] "RemoveContainer" containerID="9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb" Apr 16 18:44:00.936070 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.936035 2576 scope.go:117] 
"RemoveContainer" containerID="cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08" Apr 16 18:44:00.947357 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.947166 2576 scope.go:117] "RemoveContainer" containerID="9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb" Apr 16 18:44:00.947635 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:44:00.947513 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb\": container with ID starting with 9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb not found: ID does not exist" containerID="9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb" Apr 16 18:44:00.947635 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.947543 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb"} err="failed to get container status \"9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb\": rpc error: code = NotFound desc = could not find container \"9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb\": container with ID starting with 9b2b54f6e10e38da5e98011774452a18c087e2731b65be8b0a0918273a7e10eb not found: ID does not exist" Apr 16 18:44:00.947635 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.947563 2576 scope.go:117] "RemoveContainer" containerID="cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08" Apr 16 18:44:00.947858 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:44:00.947833 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08\": container with ID starting with cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08 not found: ID does not exist" 
containerID="cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08" Apr 16 18:44:00.947906 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.947868 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08"} err="failed to get container status \"cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08\": rpc error: code = NotFound desc = could not find container \"cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08\": container with ID starting with cd28f8e4523f1c6754586667b0c220fd8c573f240ba5aed5cbb3216e83ddcb08 not found: ID does not exist" Apr 16 18:44:00.948552 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.948535 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh"] Apr 16 18:44:00.952646 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:00.952624 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-d67dfdb78-njcvh"] Apr 16 18:44:01.396548 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.396524 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:44:01.485492 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.485362 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-kserve-provision-location\") pod \"f381abda-db1a-4ae9-8426-eb44f7a9957a\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " Apr 16 18:44:01.485492 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.485456 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-tmp\") pod \"f381abda-db1a-4ae9-8426-eb44f7a9957a\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " Apr 16 18:44:01.485492 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.485477 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57r9r\" (UniqueName: \"kubernetes.io/projected/f381abda-db1a-4ae9-8426-eb44f7a9957a-kube-api-access-57r9r\") pod \"f381abda-db1a-4ae9-8426-eb44f7a9957a\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " Apr 16 18:44:01.485776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.485513 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-cache\") pod \"f381abda-db1a-4ae9-8426-eb44f7a9957a\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " Apr 16 18:44:01.485776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.485549 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f381abda-db1a-4ae9-8426-eb44f7a9957a-tls-certs\") pod \"f381abda-db1a-4ae9-8426-eb44f7a9957a\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " 
Apr 16 18:44:01.485776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.485604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-uds\") pod \"f381abda-db1a-4ae9-8426-eb44f7a9957a\" (UID: \"f381abda-db1a-4ae9-8426-eb44f7a9957a\") " Apr 16 18:44:01.485919 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.485797 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f381abda-db1a-4ae9-8426-eb44f7a9957a" (UID: "f381abda-db1a-4ae9-8426-eb44f7a9957a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:01.485972 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.485923 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f381abda-db1a-4ae9-8426-eb44f7a9957a" (UID: "f381abda-db1a-4ae9-8426-eb44f7a9957a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:01.486183 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.486030 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f381abda-db1a-4ae9-8426-eb44f7a9957a" (UID: "f381abda-db1a-4ae9-8426-eb44f7a9957a"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:01.486368 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.486347 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f381abda-db1a-4ae9-8426-eb44f7a9957a" (UID: "f381abda-db1a-4ae9-8426-eb44f7a9957a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:01.487987 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.487959 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f381abda-db1a-4ae9-8426-eb44f7a9957a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f381abda-db1a-4ae9-8426-eb44f7a9957a" (UID: "f381abda-db1a-4ae9-8426-eb44f7a9957a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:44:01.488083 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.488010 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f381abda-db1a-4ae9-8426-eb44f7a9957a-kube-api-access-57r9r" (OuterVolumeSpecName: "kube-api-access-57r9r") pod "f381abda-db1a-4ae9-8426-eb44f7a9957a" (UID: "f381abda-db1a-4ae9-8426-eb44f7a9957a"). InnerVolumeSpecName "kube-api-access-57r9r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:44:01.586387 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.586353 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f381abda-db1a-4ae9-8426-eb44f7a9957a-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:01.586387 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.586379 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:01.586387 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.586389 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:01.586387 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.586411 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:01.586387 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.586420 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57r9r\" (UniqueName: \"kubernetes.io/projected/f381abda-db1a-4ae9-8426-eb44f7a9957a-kube-api-access-57r9r\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:01.586690 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.586428 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f381abda-db1a-4ae9-8426-eb44f7a9957a-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:01.931311 ip-10-0-137-47 kubenswrapper[2576]: 
I0416 18:44:01.931272 2576 generic.go:358] "Generic (PLEG): container finished" podID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerID="33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621" exitCode=0 Apr 16 18:44:01.931495 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.931356 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" Apr 16 18:44:01.931495 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.931352 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" event={"ID":"f381abda-db1a-4ae9-8426-eb44f7a9957a","Type":"ContainerDied","Data":"33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621"} Apr 16 18:44:01.931600 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.931516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9" event={"ID":"f381abda-db1a-4ae9-8426-eb44f7a9957a","Type":"ContainerDied","Data":"668ace76abf48bb69418d03da8d176985c0159cc472e1be421021cc26e49eafa"} Apr 16 18:44:01.931600 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.931541 2576 scope.go:117] "RemoveContainer" containerID="33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621" Apr 16 18:44:01.940939 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.940917 2576 scope.go:117] "RemoveContainer" containerID="8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106" Apr 16 18:44:01.949927 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.949496 2576 scope.go:117] "RemoveContainer" containerID="c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763" Apr 16 18:44:01.955004 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.954980 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9"] Apr 16 18:44:01.957803 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.957782 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-54f664cfj8vr9"] Apr 16 18:44:01.961614 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.961596 2576 scope.go:117] "RemoveContainer" containerID="33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621" Apr 16 18:44:01.961884 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:44:01.961866 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621\": container with ID starting with 33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621 not found: ID does not exist" containerID="33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621" Apr 16 18:44:01.961942 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.961892 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621"} err="failed to get container status \"33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621\": rpc error: code = NotFound desc = could not find container \"33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621\": container with ID starting with 33ebcf552af1e6eeeab706450ac47b1c4139b92b83231e092a980f766fc26621 not found: ID does not exist" Apr 16 18:44:01.961942 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.961911 2576 scope.go:117] "RemoveContainer" containerID="8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106" Apr 16 18:44:01.962148 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:44:01.962129 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106\": container with ID starting with 8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106 not found: ID does not exist" containerID="8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106" Apr 16 18:44:01.962219 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.962159 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106"} err="failed to get container status \"8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106\": rpc error: code = NotFound desc = could not find container \"8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106\": container with ID starting with 8d6933afbbf154dd46726f70617dc8e32eb3a639548459cb518fe6cfc4756106 not found: ID does not exist" Apr 16 18:44:01.962219 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.962182 2576 scope.go:117] "RemoveContainer" containerID="c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763" Apr 16 18:44:01.962438 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:44:01.962421 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763\": container with ID starting with c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763 not found: ID does not exist" containerID="c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763" Apr 16 18:44:01.962497 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:01.962447 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763"} err="failed to get container status \"c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763\": rpc error: code = NotFound desc = could not find container 
\"c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763\": container with ID starting with c50f1831c9822b9d671918705726383c6bde2a54aad083a1f77b7e53b1b65763 not found: ID does not exist" Apr 16 18:44:02.179360 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:02.179325 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" path="/var/lib/kubelet/pods/433cc5f7-67e2-43d6-8c41-fcd9eca85961/volumes" Apr 16 18:44:02.179749 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:02.179735 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" path="/var/lib/kubelet/pods/f381abda-db1a-4ae9-8426-eb44f7a9957a/volumes" Apr 16 18:44:57.362383 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:57.362287 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq"] Apr 16 18:44:57.363053 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:57.362721 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="main" containerID="cri-o://b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea" gracePeriod=30 Apr 16 18:44:57.363053 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:57.362761 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="tokenizer" containerID="cri-o://10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2" gracePeriod=30 Apr 16 18:44:58.117186 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.117150 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4cb74ed-fb93-49a8-8b42-687dab32388a" 
containerID="b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea" exitCode=0 Apr 16 18:44:58.117365 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.117232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" event={"ID":"d4cb74ed-fb93-49a8-8b42-687dab32388a","Type":"ContainerDied","Data":"b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea"} Apr 16 18:44:58.712747 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.712712 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:44:58.843993 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.843960 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4cb74ed-fb93-49a8-8b42-687dab32388a-tls-certs\") pod \"d4cb74ed-fb93-49a8-8b42-687dab32388a\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " Apr 16 18:44:58.844178 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844012 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-tmp\") pod \"d4cb74ed-fb93-49a8-8b42-687dab32388a\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " Apr 16 18:44:58.844178 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844039 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-kserve-provision-location\") pod \"d4cb74ed-fb93-49a8-8b42-687dab32388a\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " Apr 16 18:44:58.844178 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844058 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-uds\") pod \"d4cb74ed-fb93-49a8-8b42-687dab32388a\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " Apr 16 18:44:58.844178 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844077 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgwkj\" (UniqueName: \"kubernetes.io/projected/d4cb74ed-fb93-49a8-8b42-687dab32388a-kube-api-access-xgwkj\") pod \"d4cb74ed-fb93-49a8-8b42-687dab32388a\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " Apr 16 18:44:58.844178 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844121 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-cache\") pod \"d4cb74ed-fb93-49a8-8b42-687dab32388a\" (UID: \"d4cb74ed-fb93-49a8-8b42-687dab32388a\") " Apr 16 18:44:58.844481 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844430 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d4cb74ed-fb93-49a8-8b42-687dab32388a" (UID: "d4cb74ed-fb93-49a8-8b42-687dab32388a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:58.844546 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844493 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d4cb74ed-fb93-49a8-8b42-687dab32388a" (UID: "d4cb74ed-fb93-49a8-8b42-687dab32388a"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:58.844546 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844497 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d4cb74ed-fb93-49a8-8b42-687dab32388a" (UID: "d4cb74ed-fb93-49a8-8b42-687dab32388a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:58.844830 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.844810 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d4cb74ed-fb93-49a8-8b42-687dab32388a" (UID: "d4cb74ed-fb93-49a8-8b42-687dab32388a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:44:58.846358 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.846333 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cb74ed-fb93-49a8-8b42-687dab32388a-kube-api-access-xgwkj" (OuterVolumeSpecName: "kube-api-access-xgwkj") pod "d4cb74ed-fb93-49a8-8b42-687dab32388a" (UID: "d4cb74ed-fb93-49a8-8b42-687dab32388a"). InnerVolumeSpecName "kube-api-access-xgwkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:44:58.846434 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.846366 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4cb74ed-fb93-49a8-8b42-687dab32388a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d4cb74ed-fb93-49a8-8b42-687dab32388a" (UID: "d4cb74ed-fb93-49a8-8b42-687dab32388a"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:44:58.945205 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.945168 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:58.945205 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.945197 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d4cb74ed-fb93-49a8-8b42-687dab32388a-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:58.945422 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.945209 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:58.945422 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.945232 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:58.945422 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.945241 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d4cb74ed-fb93-49a8-8b42-687dab32388a-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:58.945422 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:58.945251 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xgwkj\" (UniqueName: \"kubernetes.io/projected/d4cb74ed-fb93-49a8-8b42-687dab32388a-kube-api-access-xgwkj\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:44:59.122272 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:44:59.122182 2576 generic.go:358] "Generic (PLEG): container finished" podID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerID="10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2" exitCode=0 Apr 16 18:44:59.122272 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.122235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" event={"ID":"d4cb74ed-fb93-49a8-8b42-687dab32388a","Type":"ContainerDied","Data":"10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2"} Apr 16 18:44:59.122272 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.122269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" event={"ID":"d4cb74ed-fb93-49a8-8b42-687dab32388a","Type":"ContainerDied","Data":"b4fc7915334c6ef9e8192bd6171c5da5bf569d9557c7e750434f5e756b821ea5"} Apr 16 18:44:59.122549 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.122277 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq" Apr 16 18:44:59.122549 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.122290 2576 scope.go:117] "RemoveContainer" containerID="10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2" Apr 16 18:44:59.130814 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.130796 2576 scope.go:117] "RemoveContainer" containerID="b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea" Apr 16 18:44:59.138107 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.138090 2576 scope.go:117] "RemoveContainer" containerID="adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89" Apr 16 18:44:59.144715 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.144691 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq"] Apr 16 18:44:59.146051 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.146020 2576 scope.go:117] "RemoveContainer" containerID="10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2" Apr 16 18:44:59.146320 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:44:59.146298 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2\": container with ID starting with 10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2 not found: ID does not exist" containerID="10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2" Apr 16 18:44:59.146376 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.146329 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2"} err="failed to get container status \"10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2\": rpc error: code = NotFound desc = 
could not find container \"10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2\": container with ID starting with 10d314be7cf7f1659c3b87fd22615a3d321104e1d0ad2633bbdd433798b3d6c2 not found: ID does not exist" Apr 16 18:44:59.146376 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.146347 2576 scope.go:117] "RemoveContainer" containerID="b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea" Apr 16 18:44:59.146709 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:44:59.146688 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea\": container with ID starting with b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea not found: ID does not exist" containerID="b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea" Apr 16 18:44:59.146790 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.146715 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea"} err="failed to get container status \"b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea\": rpc error: code = NotFound desc = could not find container \"b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea\": container with ID starting with b80aa4213fecfd3edf3466f0f5d14256d60715c77972b8156645a03d34354bea not found: ID does not exist" Apr 16 18:44:59.146790 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.146739 2576 scope.go:117] "RemoveContainer" containerID="adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89" Apr 16 18:44:59.147048 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:44:59.147028 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89\": container 
with ID starting with adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89 not found: ID does not exist" containerID="adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89" Apr 16 18:44:59.147089 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.147053 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89"} err="failed to get container status \"adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89\": rpc error: code = NotFound desc = could not find container \"adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89\": container with ID starting with adc6ce82319b9b41d4d71a6ec8f49f97e379a39ea5c64f19de27120c9cfd6f89 not found: ID does not exist" Apr 16 18:44:59.149842 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:44:59.149823 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schecsswq"] Apr 16 18:45:00.179244 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:00.179210 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" path="/var/lib/kubelet/pods/d4cb74ed-fb93-49a8-8b42-687dab32388a/volumes" Apr 16 18:45:09.414667 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414589 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c"] Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414907 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="tokenizer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414922 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="tokenizer" Apr 16 18:45:09.415172 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:45:09.414940 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" containerName="storage-initializer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414946 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" containerName="storage-initializer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414958 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="tokenizer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414964 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="tokenizer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414970 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="main" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414975 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="main" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414987 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" containerName="main" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.414992 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" containerName="main" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415001 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="storage-initializer" Apr 16 18:45:09.415172 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:45:09.415010 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="storage-initializer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415019 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="main" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415026 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="main" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415037 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="storage-initializer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415045 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="storage-initializer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415113 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="tokenizer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415127 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f381abda-db1a-4ae9-8426-eb44f7a9957a" containerName="main" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415138 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="main" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415148 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4cb74ed-fb93-49a8-8b42-687dab32388a" containerName="tokenizer" Apr 16 18:45:09.415172 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.415157 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="433cc5f7-67e2-43d6-8c41-fcd9eca85961" containerName="main" Apr 16 18:45:09.420286 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.420260 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.423265 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.423241 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 18:45:09.423381 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.423273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-zkdb8\"" Apr 16 18:45:09.423381 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.423296 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 18:45:09.423381 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.423330 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 18:45:09.425203 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.425185 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xbj66\"" Apr 16 18:45:09.431075 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.431045 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c"] Apr 16 18:45:09.531104 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.531073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.531249 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.531130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.531249 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.531154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59900019-3a7e-4dee-846d-eb55d4585745-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.531359 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.531237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.531359 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.531276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.531359 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.531348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n8jd\" (UniqueName: \"kubernetes.io/projected/59900019-3a7e-4dee-846d-eb55d4585745-kube-api-access-6n8jd\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.631953 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.631915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59900019-3a7e-4dee-846d-eb55d4585745-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632157 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.631975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632157 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.632026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632157 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.632075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n8jd\" (UniqueName: \"kubernetes.io/projected/59900019-3a7e-4dee-846d-eb55d4585745-kube-api-access-6n8jd\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632157 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.632124 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632370 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.632191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632467 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.632372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632529 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.632474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632529 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.632504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.632626 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.632574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.634679 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.634662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59900019-3a7e-4dee-846d-eb55d4585745-tls-certs\") 
pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.640846 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.640821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n8jd\" (UniqueName: \"kubernetes.io/projected/59900019-3a7e-4dee-846d-eb55d4585745-kube-api-access-6n8jd\") pod \"custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.749113 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.749022 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:09.879836 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:09.879811 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c"] Apr 16 18:45:09.881600 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:45:09.881573 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59900019_3a7e_4dee_846d_eb55d4585745.slice/crio-96e4d4022de15b5a932482590e941a1e42b27926a0fa0d9c18932ebc851bc07c WatchSource:0}: Error finding container 96e4d4022de15b5a932482590e941a1e42b27926a0fa0d9c18932ebc851bc07c: Status 404 returned error can't find the container with id 96e4d4022de15b5a932482590e941a1e42b27926a0fa0d9c18932ebc851bc07c Apr 16 18:45:10.158558 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:10.158518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" 
event={"ID":"59900019-3a7e-4dee-846d-eb55d4585745","Type":"ContainerStarted","Data":"2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675"} Apr 16 18:45:10.158558 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:10.158558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" event={"ID":"59900019-3a7e-4dee-846d-eb55d4585745","Type":"ContainerStarted","Data":"96e4d4022de15b5a932482590e941a1e42b27926a0fa0d9c18932ebc851bc07c"} Apr 16 18:45:11.163002 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:11.162964 2576 generic.go:358] "Generic (PLEG): container finished" podID="59900019-3a7e-4dee-846d-eb55d4585745" containerID="2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675" exitCode=0 Apr 16 18:45:11.163422 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:11.163055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" event={"ID":"59900019-3a7e-4dee-846d-eb55d4585745","Type":"ContainerDied","Data":"2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675"} Apr 16 18:45:12.168586 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:12.168550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" event={"ID":"59900019-3a7e-4dee-846d-eb55d4585745","Type":"ContainerStarted","Data":"38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb"} Apr 16 18:45:12.168586 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:12.168591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" event={"ID":"59900019-3a7e-4dee-846d-eb55d4585745","Type":"ContainerStarted","Data":"5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27"} Apr 16 18:45:12.169013 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:45:12.168726 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:12.192524 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:12.192464 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" podStartSLOduration=3.192446012 podStartE2EDuration="3.192446012s" podCreationTimestamp="2026-04-16 18:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:45:12.191701014 +0000 UTC m=+886.658255904" watchObservedRunningTime="2026-04-16 18:45:12.192446012 +0000 UTC m=+886.659000899" Apr 16 18:45:19.749567 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:19.749532 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:19.749567 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:19.749567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:19.751963 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:19.751940 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:20.195937 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:20.195902 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:45:26.112768 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:26.112722 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log" Apr 16 18:45:26.114260 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:26.114237 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log" Apr 16 18:45:41.199875 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:45:41.199841 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:46:05.682532 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.682461 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq"] Apr 16 18:46:05.685797 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.685777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.688862 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.688840 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-zwbjr\"" Apr 16 18:46:05.689463 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.689446 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 18:46:05.700814 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.700792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq"] Apr 16 18:46:05.803683 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.803645 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.803848 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.803697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.803848 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.803790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b3688d-767a-4065-81f5-cb98c233ec36-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.803848 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.803833 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.803989 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.803896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jshkx\" 
(UniqueName: \"kubernetes.io/projected/f8b3688d-767a-4065-81f5-cb98c233ec36-kube-api-access-jshkx\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.803989 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.803920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905343 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b3688d-767a-4065-81f5-cb98c233ec36-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jshkx\" (UniqueName: \"kubernetes.io/projected/f8b3688d-767a-4065-81f5-cb98c233ec36-kube-api-access-jshkx\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905798 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-tmp\") 
pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905874 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905874 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.905948 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.905882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.908225 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.908199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b3688d-767a-4065-81f5-cb98c233ec36-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.914097 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.914075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jshkx\" (UniqueName: \"kubernetes.io/projected/f8b3688d-767a-4065-81f5-cb98c233ec36-kube-api-access-jshkx\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:05.997079 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:05.996991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:06.128159 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:06.128129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq"] Apr 16 18:46:06.130740 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:46:06.130711 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b3688d_767a_4065_81f5_cb98c233ec36.slice/crio-8f22af7787b0528ee24614c73899e38b1771a8bf88bd4b6de67e14e796ceec21 WatchSource:0}: Error finding container 8f22af7787b0528ee24614c73899e38b1771a8bf88bd4b6de67e14e796ceec21: Status 404 returned error can't find the container with id 8f22af7787b0528ee24614c73899e38b1771a8bf88bd4b6de67e14e796ceec21 Apr 16 18:46:06.331598 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:06.331558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" event={"ID":"f8b3688d-767a-4065-81f5-cb98c233ec36","Type":"ContainerStarted","Data":"27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e"} Apr 16 18:46:06.331807 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:46:06.331606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" event={"ID":"f8b3688d-767a-4065-81f5-cb98c233ec36","Type":"ContainerStarted","Data":"8f22af7787b0528ee24614c73899e38b1771a8bf88bd4b6de67e14e796ceec21"} Apr 16 18:46:07.337191 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:07.337150 2576 generic.go:358] "Generic (PLEG): container finished" podID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerID="27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e" exitCode=0 Apr 16 18:46:07.337589 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:07.337208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" event={"ID":"f8b3688d-767a-4065-81f5-cb98c233ec36","Type":"ContainerDied","Data":"27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e"} Apr 16 18:46:08.343204 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:08.343170 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" event={"ID":"f8b3688d-767a-4065-81f5-cb98c233ec36","Type":"ContainerStarted","Data":"676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f"} Apr 16 18:46:08.343204 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:08.343205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" event={"ID":"f8b3688d-767a-4065-81f5-cb98c233ec36","Type":"ContainerStarted","Data":"1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b"} Apr 16 18:46:08.343626 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:08.343294 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:08.365423 
ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:08.365366 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" podStartSLOduration=3.365351807 podStartE2EDuration="3.365351807s" podCreationTimestamp="2026-04-16 18:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:46:08.36504758 +0000 UTC m=+942.831602479" watchObservedRunningTime="2026-04-16 18:46:08.365351807 +0000 UTC m=+942.831906693" Apr 16 18:46:15.997963 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:15.997924 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:15.997963 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:15.997974 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:16.000548 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:16.000524 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:16.373117 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:16.373088 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:46:37.376840 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:46:37.376810 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:47:06.042354 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:06.042316 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c"] Apr 16 18:47:06.042736 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:06.042620 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="main" containerID="cri-o://5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27" gracePeriod=30 Apr 16 18:47:06.042736 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:06.042660 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="tokenizer" containerID="cri-o://38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb" gracePeriod=30 Apr 16 18:47:06.542716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:06.542679 2576 generic.go:358] "Generic (PLEG): container finished" podID="59900019-3a7e-4dee-846d-eb55d4585745" containerID="5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27" exitCode=0 Apr 16 18:47:06.542896 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:06.542735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" event={"ID":"59900019-3a7e-4dee-846d-eb55d4585745","Type":"ContainerDied","Data":"5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27"} Apr 16 18:47:07.211308 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.211284 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:47:07.316798 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.316772 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-cache\") pod \"59900019-3a7e-4dee-846d-eb55d4585745\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " Apr 16 18:47:07.316944 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.316853 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59900019-3a7e-4dee-846d-eb55d4585745-tls-certs\") pod \"59900019-3a7e-4dee-846d-eb55d4585745\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " Apr 16 18:47:07.316944 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.316874 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n8jd\" (UniqueName: \"kubernetes.io/projected/59900019-3a7e-4dee-846d-eb55d4585745-kube-api-access-6n8jd\") pod \"59900019-3a7e-4dee-846d-eb55d4585745\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " Apr 16 18:47:07.316944 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.316908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-uds\") pod \"59900019-3a7e-4dee-846d-eb55d4585745\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " Apr 16 18:47:07.316944 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.316936 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-kserve-provision-location\") pod \"59900019-3a7e-4dee-846d-eb55d4585745\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " 
Apr 16 18:47:07.317153 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.316975 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-tmp\") pod \"59900019-3a7e-4dee-846d-eb55d4585745\" (UID: \"59900019-3a7e-4dee-846d-eb55d4585745\") " Apr 16 18:47:07.317153 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.317069 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "59900019-3a7e-4dee-846d-eb55d4585745" (UID: "59900019-3a7e-4dee-846d-eb55d4585745"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:07.317259 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.317205 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "59900019-3a7e-4dee-846d-eb55d4585745" (UID: "59900019-3a7e-4dee-846d-eb55d4585745"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:07.317259 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.317234 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:07.317435 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.317329 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "59900019-3a7e-4dee-846d-eb55d4585745" (UID: "59900019-3a7e-4dee-846d-eb55d4585745"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:07.318424 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.318371 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59900019-3a7e-4dee-846d-eb55d4585745" (UID: "59900019-3a7e-4dee-846d-eb55d4585745"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:07.319194 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.319170 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59900019-3a7e-4dee-846d-eb55d4585745-kube-api-access-6n8jd" (OuterVolumeSpecName: "kube-api-access-6n8jd") pod "59900019-3a7e-4dee-846d-eb55d4585745" (UID: "59900019-3a7e-4dee-846d-eb55d4585745"). InnerVolumeSpecName "kube-api-access-6n8jd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:47:07.319319 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.319301 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59900019-3a7e-4dee-846d-eb55d4585745-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "59900019-3a7e-4dee-846d-eb55d4585745" (UID: "59900019-3a7e-4dee-846d-eb55d4585745"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:47:07.418162 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.418135 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/59900019-3a7e-4dee-846d-eb55d4585745-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:07.418162 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.418161 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6n8jd\" (UniqueName: \"kubernetes.io/projected/59900019-3a7e-4dee-846d-eb55d4585745-kube-api-access-6n8jd\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:07.418336 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.418172 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:07.418336 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.418182 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:07.418336 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.418191 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/59900019-3a7e-4dee-846d-eb55d4585745-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:07.547254 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.547219 2576 generic.go:358] "Generic (PLEG): container finished" podID="59900019-3a7e-4dee-846d-eb55d4585745" containerID="38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb" exitCode=0 Apr 16 18:47:07.547426 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.547302 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" event={"ID":"59900019-3a7e-4dee-846d-eb55d4585745","Type":"ContainerDied","Data":"38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb"} Apr 16 18:47:07.547426 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.547341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" event={"ID":"59900019-3a7e-4dee-846d-eb55d4585745","Type":"ContainerDied","Data":"96e4d4022de15b5a932482590e941a1e42b27926a0fa0d9c18932ebc851bc07c"} Apr 16 18:47:07.547426 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.547359 2576 scope.go:117] "RemoveContainer" containerID="38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb" Apr 16 18:47:07.547426 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.547311 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c" Apr 16 18:47:07.556003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.555987 2576 scope.go:117] "RemoveContainer" containerID="5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27" Apr 16 18:47:07.563130 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.563115 2576 scope.go:117] "RemoveContainer" containerID="2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675" Apr 16 18:47:07.570292 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.570241 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c"] Apr 16 18:47:07.570494 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.570369 2576 scope.go:117] "RemoveContainer" containerID="38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb" Apr 16 18:47:07.570687 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:47:07.570665 2576 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb\": container with ID starting with 38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb not found: ID does not exist" containerID="38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb" Apr 16 18:47:07.570738 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.570698 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb"} err="failed to get container status \"38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb\": rpc error: code = NotFound desc = could not find container \"38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb\": container with ID starting with 38afd9844dfa55758a4aadbf113e6a8dff0209aa03031b1fe07870f05881e7cb not found: ID does not exist" Apr 16 18:47:07.570738 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.570722 2576 scope.go:117] "RemoveContainer" containerID="5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27" Apr 16 18:47:07.570992 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:47:07.570971 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27\": container with ID starting with 5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27 not found: ID does not exist" containerID="5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27" Apr 16 18:47:07.571065 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.571000 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27"} err="failed to get container status 
\"5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27\": rpc error: code = NotFound desc = could not find container \"5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27\": container with ID starting with 5958c9db39359424064370cbcb82421ff345db3c7a58d8389b0f2d02b7021b27 not found: ID does not exist" Apr 16 18:47:07.571065 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.571022 2576 scope.go:117] "RemoveContainer" containerID="2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675" Apr 16 18:47:07.571252 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:47:07.571235 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675\": container with ID starting with 2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675 not found: ID does not exist" containerID="2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675" Apr 16 18:47:07.571310 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.571258 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675"} err="failed to get container status \"2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675\": rpc error: code = NotFound desc = could not find container \"2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675\": container with ID starting with 2073d453464ade1578cc3ddffeb6fc727f8df2eab07ede115401c2fe25369675 not found: ID does not exist" Apr 16 18:47:07.574572 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:07.574554 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6c56dfc47r65c"] Apr 16 18:47:08.179795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:08.179760 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="59900019-3a7e-4dee-846d-eb55d4585745" path="/var/lib/kubelet/pods/59900019-3a7e-4dee-846d-eb55d4585745/volumes" Apr 16 18:47:23.149727 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.149690 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj"] Apr 16 18:47:23.150088 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.150000 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="main" Apr 16 18:47:23.150088 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.150010 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="main" Apr 16 18:47:23.150088 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.150043 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="storage-initializer" Apr 16 18:47:23.150088 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.150049 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="storage-initializer" Apr 16 18:47:23.150088 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.150063 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="tokenizer" Apr 16 18:47:23.150088 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.150068 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="tokenizer" Apr 16 18:47:23.150268 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.150119 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="main" Apr 16 18:47:23.150268 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.150131 2576 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="59900019-3a7e-4dee-846d-eb55d4585745" containerName="tokenizer" Apr 16 18:47:23.153251 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.153235 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.157009 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.156989 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-pnsjn\"" Apr 16 18:47:23.157138 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.157119 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 18:47:23.166621 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.166600 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj"] Apr 16 18:47:23.344352 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.344318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.344551 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.344361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.344551 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.344473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77e208cc-0489-440d-9605-c3f0011d0657-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.344551 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.344505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.344551 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.344536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhp2j\" (UniqueName: \"kubernetes.io/projected/77e208cc-0489-440d-9605-c3f0011d0657-kube-api-access-mhp2j\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.344685 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.344579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: 
\"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446060 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.445977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77e208cc-0489-440d-9605-c3f0011d0657-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446060 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446278 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhp2j\" (UniqueName: \"kubernetes.io/projected/77e208cc-0489-440d-9605-c3f0011d0657-kube-api-access-mhp2j\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446278 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446278 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446278 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446511 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446595 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446644 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.446644 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.446631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.448681 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.448662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77e208cc-0489-440d-9605-c3f0011d0657-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.454763 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.454744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhp2j\" (UniqueName: \"kubernetes.io/projected/77e208cc-0489-440d-9605-c3f0011d0657-kube-api-access-mhp2j\") pod \"router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 
18:47:23.462833 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.462816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:23.591356 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.591329 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj"] Apr 16 18:47:23.593815 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:47:23.593783 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77e208cc_0489_440d_9605_c3f0011d0657.slice/crio-475c5027dc9f55facbb2039e3e2fbaef35a6d3953b5e27ba74b963d306fc5f4d WatchSource:0}: Error finding container 475c5027dc9f55facbb2039e3e2fbaef35a6d3953b5e27ba74b963d306fc5f4d: Status 404 returned error can't find the container with id 475c5027dc9f55facbb2039e3e2fbaef35a6d3953b5e27ba74b963d306fc5f4d Apr 16 18:47:23.595765 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.595744 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:47:23.598297 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:23.598270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" event={"ID":"77e208cc-0489-440d-9605-c3f0011d0657","Type":"ContainerStarted","Data":"475c5027dc9f55facbb2039e3e2fbaef35a6d3953b5e27ba74b963d306fc5f4d"} Apr 16 18:47:24.604125 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:24.604085 2576 generic.go:358] "Generic (PLEG): container finished" podID="77e208cc-0489-440d-9605-c3f0011d0657" containerID="8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e" exitCode=0 Apr 16 18:47:24.604528 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:24.604165 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" event={"ID":"77e208cc-0489-440d-9605-c3f0011d0657","Type":"ContainerDied","Data":"8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e"} Apr 16 18:47:25.611105 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:25.611066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" event={"ID":"77e208cc-0489-440d-9605-c3f0011d0657","Type":"ContainerStarted","Data":"916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754"} Apr 16 18:47:25.611105 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:25.611110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" event={"ID":"77e208cc-0489-440d-9605-c3f0011d0657","Type":"ContainerStarted","Data":"234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1"} Apr 16 18:47:25.611575 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:25.611206 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:25.633606 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:25.633554 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" podStartSLOduration=2.63351319 podStartE2EDuration="2.63351319s" podCreationTimestamp="2026-04-16 18:47:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:47:25.632166499 +0000 UTC m=+1020.098721398" watchObservedRunningTime="2026-04-16 18:47:25.63351319 +0000 UTC m=+1020.100068077" Apr 16 18:47:33.463070 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:33.463029 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:33.463070 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:33.463075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:33.465944 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:33.465919 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:33.639497 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:33.639472 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:54.644791 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:54.644757 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" Apr 16 18:47:56.977998 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:56.977960 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq"] Apr 16 18:47:56.978439 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:56.978356 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="main" containerID="cri-o://1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b" gracePeriod=30 Apr 16 18:47:56.978594 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:56.978458 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" 
containerName="tokenizer" containerID="cri-o://676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f" gracePeriod=30 Apr 16 18:47:57.376418 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:47:57.376371 2576 logging.go:55] [core] [Channel #282 SubChannel #283]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.36:9003", ServerName: "10.132.0.36:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.36:9003: connect: connection refused" Apr 16 18:47:57.722827 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:57.722737 2576 generic.go:358] "Generic (PLEG): container finished" podID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerID="1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b" exitCode=0 Apr 16 18:47:57.722976 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:57.722816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" event={"ID":"f8b3688d-767a-4065-81f5-cb98c233ec36","Type":"ContainerDied","Data":"1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b"} Apr 16 18:47:58.137999 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.137974 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:47:58.227460 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227371 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-tmp\") pod \"f8b3688d-767a-4065-81f5-cb98c233ec36\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " Apr 16 18:47:58.227460 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227426 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-uds\") pod \"f8b3688d-767a-4065-81f5-cb98c233ec36\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " Apr 16 18:47:58.227460 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227444 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-cache\") pod \"f8b3688d-767a-4065-81f5-cb98c233ec36\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " Apr 16 18:47:58.227709 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227488 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b3688d-767a-4065-81f5-cb98c233ec36-tls-certs\") pod \"f8b3688d-767a-4065-81f5-cb98c233ec36\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " Apr 16 18:47:58.227709 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227541 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-kserve-provision-location\") pod \"f8b3688d-767a-4065-81f5-cb98c233ec36\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " Apr 16 18:47:58.227709 
ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227559 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jshkx\" (UniqueName: \"kubernetes.io/projected/f8b3688d-767a-4065-81f5-cb98c233ec36-kube-api-access-jshkx\") pod \"f8b3688d-767a-4065-81f5-cb98c233ec36\" (UID: \"f8b3688d-767a-4065-81f5-cb98c233ec36\") " Apr 16 18:47:58.227709 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227667 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f8b3688d-767a-4065-81f5-cb98c233ec36" (UID: "f8b3688d-767a-4065-81f5-cb98c233ec36"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:58.227709 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227687 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f8b3688d-767a-4065-81f5-cb98c233ec36" (UID: "f8b3688d-767a-4065-81f5-cb98c233ec36"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:58.228023 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227718 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f8b3688d-767a-4065-81f5-cb98c233ec36" (UID: "f8b3688d-767a-4065-81f5-cb98c233ec36"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:58.228023 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227890 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:58.228023 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227911 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:58.228023 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.227925 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:58.228350 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.228326 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8b3688d-767a-4065-81f5-cb98c233ec36" (UID: "f8b3688d-767a-4065-81f5-cb98c233ec36"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:58.229886 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.229862 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8b3688d-767a-4065-81f5-cb98c233ec36-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f8b3688d-767a-4065-81f5-cb98c233ec36" (UID: "f8b3688d-767a-4065-81f5-cb98c233ec36"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:47:58.230033 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.230013 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b3688d-767a-4065-81f5-cb98c233ec36-kube-api-access-jshkx" (OuterVolumeSpecName: "kube-api-access-jshkx") pod "f8b3688d-767a-4065-81f5-cb98c233ec36" (UID: "f8b3688d-767a-4065-81f5-cb98c233ec36"). InnerVolumeSpecName "kube-api-access-jshkx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:47:58.328723 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.328683 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b3688d-767a-4065-81f5-cb98c233ec36-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:58.328723 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.328721 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8b3688d-767a-4065-81f5-cb98c233ec36-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:58.328723 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.328733 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jshkx\" (UniqueName: \"kubernetes.io/projected/f8b3688d-767a-4065-81f5-cb98c233ec36-kube-api-access-jshkx\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:47:58.376551 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.376514 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.36:9003\" within 1s: context deadline exceeded" Apr 16 18:47:58.727457 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.727420 2576 
generic.go:358] "Generic (PLEG): container finished" podID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerID="676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f" exitCode=0 Apr 16 18:47:58.727633 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.727493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" event={"ID":"f8b3688d-767a-4065-81f5-cb98c233ec36","Type":"ContainerDied","Data":"676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f"} Apr 16 18:47:58.727633 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.727528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" event={"ID":"f8b3688d-767a-4065-81f5-cb98c233ec36","Type":"ContainerDied","Data":"8f22af7787b0528ee24614c73899e38b1771a8bf88bd4b6de67e14e796ceec21"} Apr 16 18:47:58.727633 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.727533 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq" Apr 16 18:47:58.727633 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.727548 2576 scope.go:117] "RemoveContainer" containerID="676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f" Apr 16 18:47:58.736711 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.736691 2576 scope.go:117] "RemoveContainer" containerID="1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b" Apr 16 18:47:58.744067 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.744051 2576 scope.go:117] "RemoveContainer" containerID="27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e" Apr 16 18:47:58.751032 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.751015 2576 scope.go:117] "RemoveContainer" containerID="676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f" Apr 16 18:47:58.751266 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:47:58.751247 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f\": container with ID starting with 676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f not found: ID does not exist" containerID="676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f" Apr 16 18:47:58.751314 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.751275 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f"} err="failed to get container status \"676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f\": rpc error: code = NotFound desc = could not find container \"676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f\": container with ID starting with 676c16a4f81749003f95977094485acc4cd11b744de57f376ccec5e0b446475f not found: ID does not exist" Apr 16 
18:47:58.751314 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.751293 2576 scope.go:117] "RemoveContainer" containerID="1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b"
Apr 16 18:47:58.751516 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:47:58.751501 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b\": container with ID starting with 1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b not found: ID does not exist" containerID="1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b"
Apr 16 18:47:58.751563 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.751521 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b"} err="failed to get container status \"1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b\": rpc error: code = NotFound desc = could not find container \"1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b\": container with ID starting with 1586d42f9f910030aa62f04a5dd7e22ac32e421d948090c5e140a4d08c29c11b not found: ID does not exist"
Apr 16 18:47:58.751563 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.751534 2576 scope.go:117] "RemoveContainer" containerID="27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e"
Apr 16 18:47:58.751709 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:47:58.751694 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e\": container with ID starting with 27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e not found: ID does not exist" containerID="27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e"
Apr 16 18:47:58.751748 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.751710 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e"} err="failed to get container status \"27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e\": rpc error: code = NotFound desc = could not find container \"27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e\": container with ID starting with 27b9f30da36b14b5b75a959cba8e720ab4567cbecfb45f6155c560905d52798e not found: ID does not exist"
Apr 16 18:47:58.771321 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.771299 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq"]
Apr 16 18:47:58.786168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:47:58.786145 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-kvsmq"]
Apr 16 18:48:00.179729 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:48:00.179700 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" path="/var/lib/kubelet/pods/f8b3688d-767a-4065-81f5-cb98c233ec36/volumes"
Apr 16 18:49:24.643878 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:24.643838 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 16 18:49:24.674955 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:24.674925 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj"]
Apr 16 18:49:24.675305 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:24.675276 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="main" containerID="cri-o://234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1" gracePeriod=30
Apr 16 18:49:24.675545 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:24.675337 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="tokenizer" containerID="cri-o://916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754" gracePeriod=30
Apr 16 18:49:24.989885 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:24.989785 2576 generic.go:358] "Generic (PLEG): container finished" podID="77e208cc-0489-440d-9605-c3f0011d0657" containerID="234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1" exitCode=0
Apr 16 18:49:24.989885 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:24.989850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" event={"ID":"77e208cc-0489-440d-9605-c3f0011d0657","Type":"ContainerDied","Data":"234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1"}
Apr 16 18:49:25.822751 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.822728 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj"
Apr 16 18:49:25.953668 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.953585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-kserve-provision-location\") pod \"77e208cc-0489-440d-9605-c3f0011d0657\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") "
Apr 16 18:49:25.953668 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.953616 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-tmp\") pod \"77e208cc-0489-440d-9605-c3f0011d0657\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") "
Apr 16 18:49:25.953668 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.953659 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhp2j\" (UniqueName: \"kubernetes.io/projected/77e208cc-0489-440d-9605-c3f0011d0657-kube-api-access-mhp2j\") pod \"77e208cc-0489-440d-9605-c3f0011d0657\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") "
Apr 16 18:49:25.953942 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.953680 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-uds\") pod \"77e208cc-0489-440d-9605-c3f0011d0657\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") "
Apr 16 18:49:25.953942 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.953722 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-cache\") pod \"77e208cc-0489-440d-9605-c3f0011d0657\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") "
Apr 16 18:49:25.953942 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.953742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77e208cc-0489-440d-9605-c3f0011d0657-tls-certs\") pod \"77e208cc-0489-440d-9605-c3f0011d0657\" (UID: \"77e208cc-0489-440d-9605-c3f0011d0657\") "
Apr 16 18:49:25.954095 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.953976 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "77e208cc-0489-440d-9605-c3f0011d0657" (UID: "77e208cc-0489-440d-9605-c3f0011d0657"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:25.954095 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.953993 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "77e208cc-0489-440d-9605-c3f0011d0657" (UID: "77e208cc-0489-440d-9605-c3f0011d0657"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:25.954095 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.954014 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "77e208cc-0489-440d-9605-c3f0011d0657" (UID: "77e208cc-0489-440d-9605-c3f0011d0657"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:25.954495 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.954376 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "77e208cc-0489-440d-9605-c3f0011d0657" (UID: "77e208cc-0489-440d-9605-c3f0011d0657"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:49:25.955921 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.955893 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77e208cc-0489-440d-9605-c3f0011d0657-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "77e208cc-0489-440d-9605-c3f0011d0657" (UID: "77e208cc-0489-440d-9605-c3f0011d0657"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:49:25.956013 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.955946 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e208cc-0489-440d-9605-c3f0011d0657-kube-api-access-mhp2j" (OuterVolumeSpecName: "kube-api-access-mhp2j") pod "77e208cc-0489-440d-9605-c3f0011d0657" (UID: "77e208cc-0489-440d-9605-c3f0011d0657"). InnerVolumeSpecName "kube-api-access-mhp2j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:49:25.994738 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.994713 2576 generic.go:358] "Generic (PLEG): container finished" podID="77e208cc-0489-440d-9605-c3f0011d0657" containerID="916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754" exitCode=0
Apr 16 18:49:25.994837 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.994796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" event={"ID":"77e208cc-0489-440d-9605-c3f0011d0657","Type":"ContainerDied","Data":"916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754"}
Apr 16 18:49:25.994837 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.994819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj" event={"ID":"77e208cc-0489-440d-9605-c3f0011d0657","Type":"ContainerDied","Data":"475c5027dc9f55facbb2039e3e2fbaef35a6d3953b5e27ba74b963d306fc5f4d"}
Apr 16 18:49:25.994907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.994835 2576 scope.go:117] "RemoveContainer" containerID="916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754"
Apr 16 18:49:25.994907 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:25.994846 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj"
Apr 16 18:49:26.003142 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.003127 2576 scope.go:117] "RemoveContainer" containerID="234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1"
Apr 16 18:49:26.010306 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.010289 2576 scope.go:117] "RemoveContainer" containerID="8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e"
Apr 16 18:49:26.018181 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.018162 2576 scope.go:117] "RemoveContainer" containerID="916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754"
Apr 16 18:49:26.018597 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:49:26.018466 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754\": container with ID starting with 916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754 not found: ID does not exist" containerID="916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754"
Apr 16 18:49:26.018597 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.018512 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754"} err="failed to get container status \"916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754\": rpc error: code = NotFound desc = could not find container \"916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754\": container with ID starting with 916c8c23d0372d97b1fce32e32dd3c119d0a20a431bfeb15eb2f775978eca754 not found: ID does not exist"
Apr 16 18:49:26.018597 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.018534 2576 scope.go:117] "RemoveContainer" containerID="234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1"
Apr 16 18:49:26.018862 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:49:26.018843 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1\": container with ID starting with 234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1 not found: ID does not exist" containerID="234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1"
Apr 16 18:49:26.018905 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.018872 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1"} err="failed to get container status \"234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1\": rpc error: code = NotFound desc = could not find container \"234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1\": container with ID starting with 234e8113178186f8a801b2720596fbe0e03768fb13cb3feaad6714db5bc0e8e1 not found: ID does not exist"
Apr 16 18:49:26.018905 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.018889 2576 scope.go:117] "RemoveContainer" containerID="8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e"
Apr 16 18:49:26.019150 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:49:26.019130 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e\": container with ID starting with 8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e not found: ID does not exist" containerID="8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e"
Apr 16 18:49:26.019213 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.019154 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e"} err="failed to get container status \"8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e\": rpc error: code = NotFound desc = could not find container \"8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e\": container with ID starting with 8c7f41e89993326755ece211bdff89c5517dab47c10c1ef4bc918280bfef4a0e not found: ID does not exist"
Apr 16 18:49:26.020241 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.020222 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj"]
Apr 16 18:49:26.024243 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.024219 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-8cdf76d65-b9klj"]
Apr 16 18:49:26.054669 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.054647 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:49:26.054669 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.054668 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/77e208cc-0489-440d-9605-c3f0011d0657-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:49:26.054788 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.054680 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:49:26.054788 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.054689 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:49:26.054788 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.054699 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mhp2j\" (UniqueName: \"kubernetes.io/projected/77e208cc-0489-440d-9605-c3f0011d0657-kube-api-access-mhp2j\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:49:26.054788 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.054707 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/77e208cc-0489-440d-9605-c3f0011d0657-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:49:26.179187 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:26.179158 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77e208cc-0489-440d-9605-c3f0011d0657" path="/var/lib/kubelet/pods/77e208cc-0489-440d-9605-c3f0011d0657/volumes"
Apr 16 18:49:30.184077 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184042 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"]
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184321 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="tokenizer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184334 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="tokenizer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184345 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="tokenizer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184365 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="tokenizer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184378 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="main"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184383 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="main"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184409 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="storage-initializer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184415 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="storage-initializer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184422 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="main"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184427 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="main"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184441 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="storage-initializer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184447 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="storage-initializer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184496 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="tokenizer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184505 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="tokenizer"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184514 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="77e208cc-0489-440d-9605-c3f0011d0657" containerName="main"
Apr 16 18:49:30.184556 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.184520 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b3688d-767a-4065-81f5-cb98c233ec36" containerName="main"
Apr 16 18:49:30.188795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.188778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.191716 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.191692 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:49:30.191826 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.191809 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:49:30.192747 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.192685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xbj66\""
Apr 16 18:49:30.192867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.192751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-4prhp\""
Apr 16 18:49:30.192867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.192781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 16 18:49:30.202465 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.202443 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"]
Apr 16 18:49:30.290291 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.290262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.290462 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.290297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcjs2\" (UniqueName: \"kubernetes.io/projected/396d049d-4242-4d80-a435-418df6a95230-kube-api-access-rcjs2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.290462 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.290347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.290462 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.290442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.290573 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.290528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/396d049d-4242-4d80-a435-418df6a95230-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.290573 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.290546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.391346 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.391309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/396d049d-4242-4d80-a435-418df6a95230-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.391346 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.391348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.391620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.391374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.391620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.391391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcjs2\" (UniqueName: \"kubernetes.io/projected/396d049d-4242-4d80-a435-418df6a95230-kube-api-access-rcjs2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.391620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.391467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.391620 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.391484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.391903 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.391873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.392027 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.391930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.392131 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.392111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.392168 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.392125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.394190 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.394167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/396d049d-4242-4d80-a435-418df6a95230-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.402509 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.402489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcjs2\" (UniqueName: \"kubernetes.io/projected/396d049d-4242-4d80-a435-418df6a95230-kube-api-access-rcjs2\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.497901 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.497827 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:30.641898 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:30.641870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"]
Apr 16 18:49:30.644057 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:49:30.644024 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod396d049d_4242_4d80_a435_418df6a95230.slice/crio-d0ed2d21697dc9033255c838b8801376f3f589d88854ea75ec31d391161aa464 WatchSource:0}: Error finding container d0ed2d21697dc9033255c838b8801376f3f589d88854ea75ec31d391161aa464: Status 404 returned error can't find the container with id d0ed2d21697dc9033255c838b8801376f3f589d88854ea75ec31d391161aa464
Apr 16 18:49:31.010080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:31.010043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" event={"ID":"396d049d-4242-4d80-a435-418df6a95230","Type":"ContainerStarted","Data":"e2220c955017bc8ea88435c24878ec474ae226e8b8b70f16e2c889a167fe5c94"}
Apr 16 18:49:31.010080 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:31.010085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" event={"ID":"396d049d-4242-4d80-a435-418df6a95230","Type":"ContainerStarted","Data":"d0ed2d21697dc9033255c838b8801376f3f589d88854ea75ec31d391161aa464"}
Apr 16 18:49:32.014220 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:32.014188 2576 generic.go:358] "Generic (PLEG): container finished" podID="396d049d-4242-4d80-a435-418df6a95230" containerID="e2220c955017bc8ea88435c24878ec474ae226e8b8b70f16e2c889a167fe5c94" exitCode=0
Apr 16 18:49:32.014663 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:32.014285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" event={"ID":"396d049d-4242-4d80-a435-418df6a95230","Type":"ContainerDied","Data":"e2220c955017bc8ea88435c24878ec474ae226e8b8b70f16e2c889a167fe5c94"}
Apr 16 18:49:33.020072 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:33.020035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" event={"ID":"396d049d-4242-4d80-a435-418df6a95230","Type":"ContainerStarted","Data":"3503a7615fe90d5c72591ef02ca95f3db382fc72aafce59275676eb03587bd1c"}
Apr 16 18:49:33.020548 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:33.020079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" event={"ID":"396d049d-4242-4d80-a435-418df6a95230","Type":"ContainerStarted","Data":"d2a9424a0f7551adb2c59033393535585f0b590cb7e35da3c301c22f7cf0aa14"}
Apr 16 18:49:33.020548 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:33.020122 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:33.043896 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:33.043856 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" podStartSLOduration=3.04383966 podStartE2EDuration="3.04383966s" podCreationTimestamp="2026-04-16 18:49:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:33.042120327 +0000 UTC m=+1147.508675213" watchObservedRunningTime="2026-04-16 18:49:33.04383966 +0000 UTC m=+1147.510394550"
Apr 16 18:49:40.498778 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:40.498688 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:40.498778 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:40.498747 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:40.501472 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:40.501445 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:49:41.049891 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:49:41.049865 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:50:02.053728 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:50:02.053697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"
Apr 16 18:50:26.135248 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:50:26.135215 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:50:26.136635 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:50:26.136603 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:52:13.025064 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.025029 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"]
Apr 16 18:52:13.028377 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.028360 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"
Apr 16 18:52:13.031414 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.031371 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-mt8hm\""
Apr 16 18:52:13.031553 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.031532 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 16 18:52:13.040167 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.040138 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"]
Apr 16 18:52:13.132038 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.131991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"
Apr 16 18:52:13.132244 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.132058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"
Apr 16 18:52:13.132244 ip-10-0-137-47 kubenswrapper[2576]:
I0416 18:52:13.132128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.132244 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.132164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6p7d\" (UniqueName: \"kubernetes.io/projected/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kube-api-access-h6p7d\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.132244 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.132191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.132244 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.132238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.232838 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:52:13.232798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233044 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.232879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233044 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.232902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6p7d\" (UniqueName: \"kubernetes.io/projected/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kube-api-access-h6p7d\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233044 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.232920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233044 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:52:13.232950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233044 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.232988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233311 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.233270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233311 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.233302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233387 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.233369 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.233469 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.233454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.235588 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.235571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.241843 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.241816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6p7d\" (UniqueName: \"kubernetes.io/projected/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kube-api-access-h6p7d\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.339781 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.339740 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:13.473649 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.473619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"] Apr 16 18:52:13.475812 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:52:13.475778 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d8a0bb9_1a16_4f8c_9687_a8725855dbb0.slice/crio-6ffed7d7e00c8c46d6e500654d8297ba7fd25ec75b2489c13fee6ad5a9e956ab WatchSource:0}: Error finding container 6ffed7d7e00c8c46d6e500654d8297ba7fd25ec75b2489c13fee6ad5a9e956ab: Status 404 returned error can't find the container with id 6ffed7d7e00c8c46d6e500654d8297ba7fd25ec75b2489c13fee6ad5a9e956ab Apr 16 18:52:13.514583 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:13.514548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" event={"ID":"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0","Type":"ContainerStarted","Data":"6ffed7d7e00c8c46d6e500654d8297ba7fd25ec75b2489c13fee6ad5a9e956ab"} Apr 16 18:52:14.520222 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:14.520179 2576 generic.go:358] "Generic (PLEG): container finished" podID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerID="b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8" exitCode=0 Apr 16 18:52:14.520637 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:14.520262 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" event={"ID":"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0","Type":"ContainerDied","Data":"b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8"} Apr 16 18:52:15.526223 ip-10-0-137-47 kubenswrapper[2576]: I0416 
18:52:15.526186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" event={"ID":"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0","Type":"ContainerStarted","Data":"a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029"} Apr 16 18:52:15.526223 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:15.526224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" event={"ID":"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0","Type":"ContainerStarted","Data":"f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099"} Apr 16 18:52:15.526691 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:15.526316 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:15.547955 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:15.547898 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" podStartSLOduration=2.547880487 podStartE2EDuration="2.547880487s" podCreationTimestamp="2026-04-16 18:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:52:15.546456225 +0000 UTC m=+1310.013011110" watchObservedRunningTime="2026-04-16 18:52:15.547880487 +0000 UTC m=+1310.014435382" Apr 16 18:52:23.340897 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:23.340859 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:23.340897 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:23.340904 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:23.343712 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:23.343686 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:23.556776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:23.556739 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:43.360878 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:43.360780 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"] Apr 16 18:52:43.361484 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:43.361190 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="main" containerID="cri-o://d2a9424a0f7551adb2c59033393535585f0b590cb7e35da3c301c22f7cf0aa14" gracePeriod=30 Apr 16 18:52:43.361484 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:43.361242 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="tokenizer" containerID="cri-o://3503a7615fe90d5c72591ef02ca95f3db382fc72aafce59275676eb03587bd1c" gracePeriod=30 Apr 16 18:52:43.635116 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:43.635029 2576 generic.go:358] "Generic (PLEG): container finished" podID="396d049d-4242-4d80-a435-418df6a95230" containerID="d2a9424a0f7551adb2c59033393535585f0b590cb7e35da3c301c22f7cf0aa14" exitCode=0 Apr 16 18:52:43.635116 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:43.635098 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" event={"ID":"396d049d-4242-4d80-a435-418df6a95230","Type":"ContainerDied","Data":"d2a9424a0f7551adb2c59033393535585f0b590cb7e35da3c301c22f7cf0aa14"} Apr 16 18:52:44.560698 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.560672 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" Apr 16 18:52:44.640341 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.640308 2576 generic.go:358] "Generic (PLEG): container finished" podID="396d049d-4242-4d80-a435-418df6a95230" containerID="3503a7615fe90d5c72591ef02ca95f3db382fc72aafce59275676eb03587bd1c" exitCode=0 Apr 16 18:52:44.640516 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.640381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" event={"ID":"396d049d-4242-4d80-a435-418df6a95230","Type":"ContainerDied","Data":"3503a7615fe90d5c72591ef02ca95f3db382fc72aafce59275676eb03587bd1c"} Apr 16 18:52:44.716083 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.716060 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" Apr 16 18:52:44.809403 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.809371 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-tmp\") pod \"396d049d-4242-4d80-a435-418df6a95230\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " Apr 16 18:52:44.809570 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.809466 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/396d049d-4242-4d80-a435-418df6a95230-tls-certs\") pod \"396d049d-4242-4d80-a435-418df6a95230\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " Apr 16 18:52:44.809570 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.809504 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcjs2\" (UniqueName: \"kubernetes.io/projected/396d049d-4242-4d80-a435-418df6a95230-kube-api-access-rcjs2\") pod \"396d049d-4242-4d80-a435-418df6a95230\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " Apr 16 18:52:44.809570 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.809554 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-uds\") pod \"396d049d-4242-4d80-a435-418df6a95230\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " Apr 16 18:52:44.809890 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.809735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-kserve-provision-location\") pod \"396d049d-4242-4d80-a435-418df6a95230\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " Apr 16 
18:52:44.809890 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.809790 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "396d049d-4242-4d80-a435-418df6a95230" (UID: "396d049d-4242-4d80-a435-418df6a95230"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:44.809890 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.809802 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-cache\") pod \"396d049d-4242-4d80-a435-418df6a95230\" (UID: \"396d049d-4242-4d80-a435-418df6a95230\") " Apr 16 18:52:44.809890 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.809873 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "396d049d-4242-4d80-a435-418df6a95230" (UID: "396d049d-4242-4d80-a435-418df6a95230"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:44.810182 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.810039 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "396d049d-4242-4d80-a435-418df6a95230" (UID: "396d049d-4242-4d80-a435-418df6a95230"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:44.810182 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.810145 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:52:44.810182 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.810156 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:52:44.810182 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.810165 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:52:44.810596 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.810510 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "396d049d-4242-4d80-a435-418df6a95230" (UID: "396d049d-4242-4d80-a435-418df6a95230"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:52:44.811859 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.811834 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396d049d-4242-4d80-a435-418df6a95230-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "396d049d-4242-4d80-a435-418df6a95230" (UID: "396d049d-4242-4d80-a435-418df6a95230"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:52:44.811955 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.811860 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396d049d-4242-4d80-a435-418df6a95230-kube-api-access-rcjs2" (OuterVolumeSpecName: "kube-api-access-rcjs2") pod "396d049d-4242-4d80-a435-418df6a95230" (UID: "396d049d-4242-4d80-a435-418df6a95230"). InnerVolumeSpecName "kube-api-access-rcjs2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:52:44.910924 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.910883 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/396d049d-4242-4d80-a435-418df6a95230-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:52:44.910924 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.910919 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/396d049d-4242-4d80-a435-418df6a95230-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:52:44.910924 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:44.910931 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rcjs2\" (UniqueName: \"kubernetes.io/projected/396d049d-4242-4d80-a435-418df6a95230-kube-api-access-rcjs2\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:52:45.645488 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:45.645455 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" Apr 16 18:52:45.645903 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:45.645455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8" event={"ID":"396d049d-4242-4d80-a435-418df6a95230","Type":"ContainerDied","Data":"d0ed2d21697dc9033255c838b8801376f3f589d88854ea75ec31d391161aa464"} Apr 16 18:52:45.645903 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:45.645579 2576 scope.go:117] "RemoveContainer" containerID="3503a7615fe90d5c72591ef02ca95f3db382fc72aafce59275676eb03587bd1c" Apr 16 18:52:45.654262 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:45.654242 2576 scope.go:117] "RemoveContainer" containerID="d2a9424a0f7551adb2c59033393535585f0b590cb7e35da3c301c22f7cf0aa14" Apr 16 18:52:45.661647 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:45.661626 2576 scope.go:117] "RemoveContainer" containerID="e2220c955017bc8ea88435c24878ec474ae226e8b8b70f16e2c889a167fe5c94" Apr 16 18:52:45.667060 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:45.667036 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"] Apr 16 18:52:45.670460 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:45.670440 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schen2wb8"] Apr 16 18:52:46.179893 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:52:46.179858 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396d049d-4242-4d80-a435-418df6a95230" path="/var/lib/kubelet/pods/396d049d-4242-4d80-a435-418df6a95230/volumes" Apr 16 18:53:04.975213 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975173 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"]
Apr 16 18:53:04.975624 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975527 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="tokenizer"
Apr 16 18:53:04.975624 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975539 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="tokenizer"
Apr 16 18:53:04.975624 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975555 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="main"
Apr 16 18:53:04.975624 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975560 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="main"
Apr 16 18:53:04.975624 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975567 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="storage-initializer"
Apr 16 18:53:04.975624 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975573 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="storage-initializer"
Apr 16 18:53:04.975859 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975636 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="main"
Apr 16 18:53:04.975859 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.975644 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="396d049d-4242-4d80-a435-418df6a95230" containerName="tokenizer"
Apr 16 18:53:04.980278 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.980254 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:04.982758 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.982735 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-bjp8m\""
Apr 16 18:53:04.982888 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.982738 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 16 18:53:04.991734 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:04.991712 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"]
Apr 16 18:53:05.074769 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.074736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f7faa08-a15d-4c4e-8a77-f838669b928e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.074955 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.074777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.074955 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.074800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnb7t\" (UniqueName: \"kubernetes.io/projected/3f7faa08-a15d-4c4e-8a77-f838669b928e-kube-api-access-xnb7t\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.074955 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.074883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.074955 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.074920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.075228 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.075032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.175964 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.175918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.175964 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.175966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f7faa08-a15d-4c4e-8a77-f838669b928e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.176221 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.175995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.176221 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.176020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnb7t\" (UniqueName: \"kubernetes.io/projected/3f7faa08-a15d-4c4e-8a77-f838669b928e-kube-api-access-xnb7t\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.176221 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.176069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.176221 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.176099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.176388 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.176375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.176508 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.176489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.176508 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.176500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.176584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.176539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.178790 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.178769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f7faa08-a15d-4c4e-8a77-f838669b928e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.184114 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.184088 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnb7t\" (UniqueName: \"kubernetes.io/projected/3f7faa08-a15d-4c4e-8a77-f838669b928e-kube-api-access-xnb7t\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.291659 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.291553 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:05.421671 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.421631 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"]
Apr 16 18:53:05.424223 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:53:05.424195 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7faa08_a15d_4c4e_8a77_f838669b928e.slice/crio-5149345b99db21fc589c393b7c27aca6437c083eb3ee5a76a13e073a07838e72 WatchSource:0}: Error finding container 5149345b99db21fc589c393b7c27aca6437c083eb3ee5a76a13e073a07838e72: Status 404 returned error can't find the container with id 5149345b99db21fc589c393b7c27aca6437c083eb3ee5a76a13e073a07838e72
Apr 16 18:53:05.426453 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.426438 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:53:05.719871 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.719827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" event={"ID":"3f7faa08-a15d-4c4e-8a77-f838669b928e","Type":"ContainerStarted","Data":"49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737"}
Apr 16 18:53:05.720044 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:05.719882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" event={"ID":"3f7faa08-a15d-4c4e-8a77-f838669b928e","Type":"ContainerStarted","Data":"5149345b99db21fc589c393b7c27aca6437c083eb3ee5a76a13e073a07838e72"}
Apr 16 18:53:06.723910 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:06.723874 2576 generic.go:358] "Generic (PLEG): container finished" podID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerID="49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737" exitCode=0
Apr 16 18:53:06.724289 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:06.723945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" event={"ID":"3f7faa08-a15d-4c4e-8a77-f838669b928e","Type":"ContainerDied","Data":"49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737"}
Apr 16 18:53:07.729841 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:07.729802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" event={"ID":"3f7faa08-a15d-4c4e-8a77-f838669b928e","Type":"ContainerStarted","Data":"a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14"}
Apr 16 18:53:07.729841 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:07.729843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" event={"ID":"3f7faa08-a15d-4c4e-8a77-f838669b928e","Type":"ContainerStarted","Data":"d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883"}
Apr 16 18:53:07.730288 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:07.729974 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:07.752415 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:07.752342 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" podStartSLOduration=3.7523239840000002 podStartE2EDuration="3.752323984s" podCreationTimestamp="2026-04-16 18:53:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:53:07.751444843 +0000 UTC m=+1362.217999729" watchObservedRunningTime="2026-04-16 18:53:07.752323984 +0000 UTC m=+1362.218878871"
Apr 16 18:53:15.292274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:15.292230 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:15.292274 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:15.292283 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:15.294954 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:15.294932 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:15.757203 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:15.757173 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:53:36.760970 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:53:36.760937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:55:26.158790 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:26.158755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:55:26.161410 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:26.161370 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 18:55:49.531104 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:49.531012 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"]
Apr 16 18:55:49.531598 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:49.531352 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="main" containerID="cri-o://f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099" gracePeriod=30
Apr 16 18:55:49.531598 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:49.531428 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="tokenizer" containerID="cri-o://a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029" gracePeriod=30
Apr 16 18:55:50.273516 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.273481 2576 generic.go:358] "Generic (PLEG): container finished" podID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerID="f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099" exitCode=0
Apr 16 18:55:50.273701 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.273541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" event={"ID":"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0","Type":"ContainerDied","Data":"f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099"}
Apr 16 18:55:50.888917 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.888892 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"
Apr 16 18:55:50.898077 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898053 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-cache\") pod \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") "
Apr 16 18:55:50.898231 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898100 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6p7d\" (UniqueName: \"kubernetes.io/projected/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kube-api-access-h6p7d\") pod \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") "
Apr 16 18:55:50.898231 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898136 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-uds\") pod \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") "
Apr 16 18:55:50.898231 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898168 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kserve-provision-location\") pod \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") "
Apr 16 18:55:50.898388 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898323 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" (UID: "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:50.898388 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898326 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-tmp\") pod \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") "
Apr 16 18:55:50.898562 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898434 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tls-certs\") pod \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\" (UID: \"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0\") "
Apr 16 18:55:50.898562 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898438 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" (UID: "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:50.898661 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898618 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" (UID: "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:50.898728 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898712 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:55:50.898789 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898730 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:55:50.898789 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898739 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:55:50.899053 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.898960 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" (UID: "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:55:50.900555 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.900529 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kube-api-access-h6p7d" (OuterVolumeSpecName: "kube-api-access-h6p7d") pod "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" (UID: "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0"). InnerVolumeSpecName "kube-api-access-h6p7d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:55:50.900658 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.900592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" (UID: "5d8a0bb9-1a16-4f8c-9687-a8725855dbb0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:55:50.999630 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.999527 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:55:50.999630 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.999568 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h6p7d\" (UniqueName: \"kubernetes.io/projected/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kube-api-access-h6p7d\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:55:50.999630 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:50.999581 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:55:51.278970 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.278878 2576 generic.go:358] "Generic (PLEG): container finished" podID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerID="a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029" exitCode=0
Apr 16 18:55:51.278970 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.278937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" event={"ID":"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0","Type":"ContainerDied","Data":"a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029"}
Apr 16 18:55:51.278970 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.278954 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"
Apr 16 18:55:51.278970 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.278970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64" event={"ID":"5d8a0bb9-1a16-4f8c-9687-a8725855dbb0","Type":"ContainerDied","Data":"6ffed7d7e00c8c46d6e500654d8297ba7fd25ec75b2489c13fee6ad5a9e956ab"}
Apr 16 18:55:51.279487 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.278991 2576 scope.go:117] "RemoveContainer" containerID="a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029"
Apr 16 18:55:51.288898 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.288875 2576 scope.go:117] "RemoveContainer" containerID="f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099"
Apr 16 18:55:51.296542 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.296523 2576 scope.go:117] "RemoveContainer" containerID="b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8"
Apr 16 18:55:51.302322 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.302299 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"]
Apr 16 18:55:51.305414 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.305342 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schew7f64"]
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.305879 2576 scope.go:117] "RemoveContainer" containerID="a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029"
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:55:51.306280 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029\": container with ID starting with a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029 not found: ID does not exist" containerID="a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029"
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.306318 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029"} err="failed to get container status \"a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029\": rpc error: code = NotFound desc = could not find container \"a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029\": container with ID starting with a0e43b704a27de472a52ed45c2ebd083b7e27c0c9a0a7c2eee11f3c56bb90029 not found: ID does not exist"
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.306343 2576 scope.go:117] "RemoveContainer" containerID="f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099"
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:55:51.306755 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099\": container with ID starting with f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099 not found: ID does not exist" containerID="f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099"
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.306782 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099"} err="failed to get container status \"f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099\": rpc error: code = NotFound desc = could not find container \"f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099\": container with ID starting with f0dfcfe51b44d5e201e47f84fc440f7691ce3d7ecbf6819431f20bf1809e1099 not found: ID does not exist"
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.306801 2576 scope.go:117] "RemoveContainer" containerID="b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8"
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:55:51.307091 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8\": container with ID starting with b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8 not found: ID does not exist" containerID="b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8"
Apr 16 18:55:51.307904 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:51.307113 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8"} err="failed to get container status \"b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8\": rpc error: code = NotFound desc = could not find container \"b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8\": container with ID starting with b2f0c1d2721d38b9247cc68266ca61c94f72514f09dbe7a78916e41185b86ab8 not found: ID does not exist"
Apr 16 18:55:52.179614 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.179583 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" path="/var/lib/kubelet/pods/5d8a0bb9-1a16-4f8c-9687-a8725855dbb0/volumes"
Apr 16 18:55:52.519141 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519061 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"]
Apr 16 18:55:52.519440 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519427 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="tokenizer"
Apr 16 18:55:52.519494 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519443 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="tokenizer"
Apr 16 18:55:52.519494 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519462 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="storage-initializer"
Apr 16 18:55:52.519494 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519468 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="storage-initializer"
Apr 16 18:55:52.519494 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519479 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="main"
Apr 16 18:55:52.519494 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519484 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="main"
Apr 16 18:55:52.519639 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519541 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="main"
Apr 16 18:55:52.519639 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.519550 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d8a0bb9-1a16-4f8c-9687-a8725855dbb0" containerName="tokenizer"
Apr 16 18:55:52.524097 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.524080 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.527074 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.527048 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 16 18:55:52.535957 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.535932 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"]
Apr 16 18:55:52.613869 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.613832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clp8s\" (UniqueName: \"kubernetes.io/projected/84a5bb96-a09c-41fb-bd96-42b5f4146433-kube-api-access-clp8s\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.614115 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.613890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a5bb96-a09c-41fb-bd96-42b5f4146433-tls-certs\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.614115 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.613925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-model-cache\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.614115 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.613948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.614115 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.613992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-home\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.614115 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.614015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-dshm\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.715007 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.714966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clp8s\" (UniqueName: \"kubernetes.io/projected/84a5bb96-a09c-41fb-bd96-42b5f4146433-kube-api-access-clp8s\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.715186 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.715016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a5bb96-a09c-41fb-bd96-42b5f4146433-tls-certs\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.715186 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.715048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-model-cache\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.715186 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.715072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.715186 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.715114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-home\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"
Apr 16 18:55:52.715186 ip-10-0-137-47
kubenswrapper[2576]: I0416 18:55:52.715134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-dshm\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:55:52.715533 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.715510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-model-cache\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:55:52.715587 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.715559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:55:52.715628 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.715596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-home\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:55:52.717606 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.717579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-dshm\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:55:52.717785 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.717767 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a5bb96-a09c-41fb-bd96-42b5f4146433-tls-certs\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:55:52.725086 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.725063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clp8s\" (UniqueName: \"kubernetes.io/projected/84a5bb96-a09c-41fb-bd96-42b5f4146433-kube-api-access-clp8s\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:55:52.833784 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.833742 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:55:52.990809 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:55:52.990762 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a5bb96_a09c_41fb_bd96_42b5f4146433.slice/crio-e9025a73d257b07a17dd5e13e5288d9e71459749b602931a37a86384c691ae56 WatchSource:0}: Error finding container e9025a73d257b07a17dd5e13e5288d9e71459749b602931a37a86384c691ae56: Status 404 returned error can't find the container with id e9025a73d257b07a17dd5e13e5288d9e71459749b602931a37a86384c691ae56 Apr 16 18:55:52.992749 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:52.992715 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"] Apr 16 18:55:53.287105 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:53.287009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" event={"ID":"84a5bb96-a09c-41fb-bd96-42b5f4146433","Type":"ContainerStarted","Data":"9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7"} Apr 16 18:55:53.287105 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:53.287048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" event={"ID":"84a5bb96-a09c-41fb-bd96-42b5f4146433","Type":"ContainerStarted","Data":"e9025a73d257b07a17dd5e13e5288d9e71459749b602931a37a86384c691ae56"} Apr 16 18:55:57.302097 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:57.302064 2576 generic.go:358] "Generic (PLEG): container finished" podID="84a5bb96-a09c-41fb-bd96-42b5f4146433" containerID="9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7" exitCode=0 Apr 16 18:55:57.302552 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:57.302142 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" event={"ID":"84a5bb96-a09c-41fb-bd96-42b5f4146433","Type":"ContainerDied","Data":"9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7"} Apr 16 18:55:58.307900 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:58.307865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" event={"ID":"84a5bb96-a09c-41fb-bd96-42b5f4146433","Type":"ContainerStarted","Data":"73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34"} Apr 16 18:55:58.325895 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:55:58.325833 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" podStartSLOduration=6.325815531 podStartE2EDuration="6.325815531s" podCreationTimestamp="2026-04-16 18:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:55:58.324971134 +0000 UTC m=+1532.791526020" watchObservedRunningTime="2026-04-16 18:55:58.325815531 +0000 UTC m=+1532.792370417" Apr 16 18:56:02.833995 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:02.833958 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:56:02.833995 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:02.834003 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:56:02.846777 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:02.846748 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:56:03.337967 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:56:03.337937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:56:26.610336 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:26.610287 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"] Apr 16 18:56:26.610874 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:26.610697 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" podUID="84a5bb96-a09c-41fb-bd96-42b5f4146433" containerName="main" containerID="cri-o://73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34" gracePeriod=30 Apr 16 18:56:26.863174 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:26.863108 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:56:27.028479 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.028442 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-kserve-provision-location\") pod \"84a5bb96-a09c-41fb-bd96-42b5f4146433\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " Apr 16 18:56:27.028679 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.028496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-dshm\") pod \"84a5bb96-a09c-41fb-bd96-42b5f4146433\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " Apr 16 18:56:27.028679 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.028522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-home\") pod \"84a5bb96-a09c-41fb-bd96-42b5f4146433\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " Apr 16 18:56:27.028679 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.028545 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a5bb96-a09c-41fb-bd96-42b5f4146433-tls-certs\") pod \"84a5bb96-a09c-41fb-bd96-42b5f4146433\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " Apr 16 18:56:27.028679 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.028574 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-model-cache\") pod \"84a5bb96-a09c-41fb-bd96-42b5f4146433\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " Apr 16 18:56:27.028679 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.028599 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clp8s\" (UniqueName: \"kubernetes.io/projected/84a5bb96-a09c-41fb-bd96-42b5f4146433-kube-api-access-clp8s\") pod \"84a5bb96-a09c-41fb-bd96-42b5f4146433\" (UID: \"84a5bb96-a09c-41fb-bd96-42b5f4146433\") " Apr 16 18:56:27.028968 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.028852 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-home" (OuterVolumeSpecName: "home") pod "84a5bb96-a09c-41fb-bd96-42b5f4146433" (UID: "84a5bb96-a09c-41fb-bd96-42b5f4146433"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:27.028968 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.028915 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-model-cache" (OuterVolumeSpecName: "model-cache") pod "84a5bb96-a09c-41fb-bd96-42b5f4146433" (UID: "84a5bb96-a09c-41fb-bd96-42b5f4146433"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:27.030999 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.030969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a5bb96-a09c-41fb-bd96-42b5f4146433-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "84a5bb96-a09c-41fb-bd96-42b5f4146433" (UID: "84a5bb96-a09c-41fb-bd96-42b5f4146433"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:56:27.030999 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.030984 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-dshm" (OuterVolumeSpecName: "dshm") pod "84a5bb96-a09c-41fb-bd96-42b5f4146433" (UID: "84a5bb96-a09c-41fb-bd96-42b5f4146433"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:27.031270 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.031251 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a5bb96-a09c-41fb-bd96-42b5f4146433-kube-api-access-clp8s" (OuterVolumeSpecName: "kube-api-access-clp8s") pod "84a5bb96-a09c-41fb-bd96-42b5f4146433" (UID: "84a5bb96-a09c-41fb-bd96-42b5f4146433"). InnerVolumeSpecName "kube-api-access-clp8s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:56:27.086012 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.085966 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84a5bb96-a09c-41fb-bd96-42b5f4146433" (UID: "84a5bb96-a09c-41fb-bd96-42b5f4146433"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:27.130123 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.130026 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:56:27.130123 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.130057 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-dshm\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:56:27.130123 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.130071 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-home\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:56:27.130123 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.130082 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a5bb96-a09c-41fb-bd96-42b5f4146433-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:56:27.130123 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.130095 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/84a5bb96-a09c-41fb-bd96-42b5f4146433-model-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:56:27.130123 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.130107 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clp8s\" (UniqueName: \"kubernetes.io/projected/84a5bb96-a09c-41fb-bd96-42b5f4146433-kube-api-access-clp8s\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:56:27.414193 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.414095 2576 generic.go:358] "Generic (PLEG): container finished" podID="84a5bb96-a09c-41fb-bd96-42b5f4146433" containerID="73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34" exitCode=0 Apr 16 18:56:27.414381 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.414182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" event={"ID":"84a5bb96-a09c-41fb-bd96-42b5f4146433","Type":"ContainerDied","Data":"73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34"} Apr 16 18:56:27.414381 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.414203 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" Apr 16 18:56:27.414381 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.414223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6" event={"ID":"84a5bb96-a09c-41fb-bd96-42b5f4146433","Type":"ContainerDied","Data":"e9025a73d257b07a17dd5e13e5288d9e71459749b602931a37a86384c691ae56"} Apr 16 18:56:27.414381 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.414239 2576 scope.go:117] "RemoveContainer" containerID="73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34" Apr 16 18:56:27.423145 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.423126 2576 scope.go:117] "RemoveContainer" containerID="9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7" Apr 16 18:56:27.435177 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.435153 2576 scope.go:117] "RemoveContainer" containerID="73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34" Apr 16 18:56:27.435536 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:56:27.435513 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34\": container with ID starting with 73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34 not found: ID does not exist" containerID="73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34" Apr 16 18:56:27.435634 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.435558 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34"} err="failed to get container status \"73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34\": rpc error: code = NotFound desc = could not find container 
\"73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34\": container with ID starting with 73c6760f764fc65437c56fe6323724844c198b563172bdfcb86d13320ceb8f34 not found: ID does not exist" Apr 16 18:56:27.435634 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.435585 2576 scope.go:117] "RemoveContainer" containerID="9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7" Apr 16 18:56:27.435867 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:56:27.435843 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7\": container with ID starting with 9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7 not found: ID does not exist" containerID="9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7" Apr 16 18:56:27.435922 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.435875 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7"} err="failed to get container status \"9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7\": rpc error: code = NotFound desc = could not find container \"9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7\": container with ID starting with 9ee21c0894b5f02cb86a8b92bada06f8a37fe1d2bd67460d9ec2a637c73947a7 not found: ID does not exist" Apr 16 18:56:27.436660 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.436641 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"] Apr 16 18:56:27.440413 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:27.440373 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-4d4v6"] Apr 16 18:56:28.179583 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:28.179549 
2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a5bb96-a09c-41fb-bd96-42b5f4146433" path="/var/lib/kubelet/pods/84a5bb96-a09c-41fb-bd96-42b5f4146433/volumes" Apr 16 18:56:37.894382 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:37.894331 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"] Apr 16 18:56:37.894992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:37.894675 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="main" containerID="cri-o://d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883" gracePeriod=30 Apr 16 18:56:37.894992 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:37.894786 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="tokenizer" containerID="cri-o://a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14" gracePeriod=30 Apr 16 18:56:38.454075 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:38.454038 2576 generic.go:358] "Generic (PLEG): container finished" podID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerID="d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883" exitCode=0 Apr 16 18:56:38.454271 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:38.454105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" event={"ID":"3f7faa08-a15d-4c4e-8a77-f838669b928e","Type":"ContainerDied","Data":"d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883"} Apr 16 18:56:39.247776 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.247745 2576 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" Apr 16 18:56:39.333442 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333313 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f7faa08-a15d-4c4e-8a77-f838669b928e-tls-certs\") pod \"3f7faa08-a15d-4c4e-8a77-f838669b928e\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " Apr 16 18:56:39.333442 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333358 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-cache\") pod \"3f7faa08-a15d-4c4e-8a77-f838669b928e\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " Apr 16 18:56:39.333442 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333378 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnb7t\" (UniqueName: \"kubernetes.io/projected/3f7faa08-a15d-4c4e-8a77-f838669b928e-kube-api-access-xnb7t\") pod \"3f7faa08-a15d-4c4e-8a77-f838669b928e\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " Apr 16 18:56:39.333442 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333421 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-kserve-provision-location\") pod \"3f7faa08-a15d-4c4e-8a77-f838669b928e\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " Apr 16 18:56:39.333795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333460 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-uds\") pod \"3f7faa08-a15d-4c4e-8a77-f838669b928e\" (UID: 
\"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " Apr 16 18:56:39.333795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333494 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-tmp\") pod \"3f7faa08-a15d-4c4e-8a77-f838669b928e\" (UID: \"3f7faa08-a15d-4c4e-8a77-f838669b928e\") " Apr 16 18:56:39.333795 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333698 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3f7faa08-a15d-4c4e-8a77-f838669b928e" (UID: "3f7faa08-a15d-4c4e-8a77-f838669b928e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:56:39.333950 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333823 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3f7faa08-a15d-4c4e-8a77-f838669b928e" (UID: "3f7faa08-a15d-4c4e-8a77-f838669b928e"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:56:39.333950 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.333847 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:56:39.334189 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.334159 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3f7faa08-a15d-4c4e-8a77-f838669b928e" (UID: "3f7faa08-a15d-4c4e-8a77-f838669b928e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:56:39.334780 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.334747 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3f7faa08-a15d-4c4e-8a77-f838669b928e" (UID: "3f7faa08-a15d-4c4e-8a77-f838669b928e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:56:39.335871 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.335844 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7faa08-a15d-4c4e-8a77-f838669b928e-kube-api-access-xnb7t" (OuterVolumeSpecName: "kube-api-access-xnb7t") pod "3f7faa08-a15d-4c4e-8a77-f838669b928e" (UID: "3f7faa08-a15d-4c4e-8a77-f838669b928e"). InnerVolumeSpecName "kube-api-access-xnb7t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:56:39.335871 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.335853 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7faa08-a15d-4c4e-8a77-f838669b928e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3f7faa08-a15d-4c4e-8a77-f838669b928e" (UID: "3f7faa08-a15d-4c4e-8a77-f838669b928e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:56:39.435122 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.435084 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:56:39.435122 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.435118 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:56:39.435122 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.435129 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f7faa08-a15d-4c4e-8a77-f838669b928e-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:56:39.435356 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.435138 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f7faa08-a15d-4c4e-8a77-f838669b928e-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:56:39.435356 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.435146 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnb7t\" (UniqueName: \"kubernetes.io/projected/3f7faa08-a15d-4c4e-8a77-f838669b928e-kube-api-access-xnb7t\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\""
Apr 16 18:56:39.465605 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.465566 2576 generic.go:358] "Generic (PLEG): container finished" podID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerID="a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14" exitCode=0
Apr 16 18:56:39.465787 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.465652 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"
Apr 16 18:56:39.465787 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.465643 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" event={"ID":"3f7faa08-a15d-4c4e-8a77-f838669b928e","Type":"ContainerDied","Data":"a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14"}
Apr 16 18:56:39.465787 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.465753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv" event={"ID":"3f7faa08-a15d-4c4e-8a77-f838669b928e","Type":"ContainerDied","Data":"5149345b99db21fc589c393b7c27aca6437c083eb3ee5a76a13e073a07838e72"}
Apr 16 18:56:39.465787 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.465770 2576 scope.go:117] "RemoveContainer" containerID="a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14"
Apr 16 18:56:39.474509 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.474491 2576 scope.go:117] "RemoveContainer" containerID="d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883"
Apr 16 18:56:39.482041 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.482015 2576 scope.go:117] "RemoveContainer" containerID="49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737"
Apr 16 18:56:39.487513 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.487489 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"]
Apr 16 18:56:39.490058 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.490019 2576 scope.go:117] "RemoveContainer" containerID="a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14"
Apr 16 18:56:39.490460 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:56:39.490432 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14\": container with ID starting with a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14 not found: ID does not exist" containerID="a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14"
Apr 16 18:56:39.490627 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.490590 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14"} err="failed to get container status \"a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14\": rpc error: code = NotFound desc = could not find container \"a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14\": container with ID starting with a24fb3a1dc86f59e6726c8f3be8d3e26db8d825b2d4d270d197df6734a75bb14 not found: ID does not exist"
Apr 16 18:56:39.490746 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.490629 2576 scope.go:117] "RemoveContainer" containerID="d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883"
Apr 16 18:56:39.490941 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:56:39.490916 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883\": container with ID starting with d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883 not found: ID does not exist" containerID="d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883"
Apr 16 18:56:39.491009 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.490946 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883"} err="failed to get container status \"d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883\": rpc error: code = NotFound desc = could not find container \"d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883\": container with ID starting with d39f78bce265e49a16e02654aa1355921b70fb0019c921f202383420198e3883 not found: ID does not exist"
Apr 16 18:56:39.491009 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.490960 2576 scope.go:117] "RemoveContainer" containerID="49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737"
Apr 16 18:56:39.491241 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:56:39.491223 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737\": container with ID starting with 49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737 not found: ID does not exist" containerID="49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737"
Apr 16 18:56:39.491302 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.491247 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737"} err="failed to get container status \"49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737\": rpc error: code = NotFound desc = could not find container \"49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737\": container with ID starting with 49c86267701cd0ddeb35634bcc02d2bac1d9350f0278129dfa6e26054e1d5737 not found: ID does not exist"
Apr 16 18:56:39.491827 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:39.491808 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb59xchv"]
Apr 16 18:56:40.179643 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:40.179610 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" path="/var/lib/kubelet/pods/3f7faa08-a15d-4c4e-8a77-f838669b928e/volumes"
Apr 16 18:56:43.479820 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.479780 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"]
Apr 16 18:56:43.480867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480845 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="tokenizer"
Apr 16 18:56:43.480867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480867 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="tokenizer"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480878 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84a5bb96-a09c-41fb-bd96-42b5f4146433" containerName="storage-initializer"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480884 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a5bb96-a09c-41fb-bd96-42b5f4146433" containerName="storage-initializer"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480908 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="main"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480914 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="main"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480920 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84a5bb96-a09c-41fb-bd96-42b5f4146433" containerName="main"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480925 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a5bb96-a09c-41fb-bd96-42b5f4146433" containerName="main"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480936 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="storage-initializer"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480942 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="storage-initializer"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.480997 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="84a5bb96-a09c-41fb-bd96-42b5f4146433" containerName="main"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.481007 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="tokenizer"
Apr 16 18:56:43.481069 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.481015 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f7faa08-a15d-4c4e-8a77-f838669b928e" containerName="main"
Apr 16 18:56:43.485875 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.485851 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.488672 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.488647 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 18:56:43.488830 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.488789 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 16 18:56:43.489353 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.489337 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 18:56:43.489472 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.489373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-qvhwc\""
Apr 16 18:56:43.489472 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.489428 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xbj66\""
Apr 16 18:56:43.493648 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.493606 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"]
Apr 16 18:56:43.570941 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.570906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.570941 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.570943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptn88\" (UniqueName: \"kubernetes.io/projected/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kube-api-access-ptn88\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.571184 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.570962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.571184 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.570985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.571184 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.571041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.571184 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.571080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.672524 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.672490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.672718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.672547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.672718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.672589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.672718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.672606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptn88\" (UniqueName: \"kubernetes.io/projected/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kube-api-access-ptn88\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.672718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.672625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.672718 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.672659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.673015 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.672905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.673015 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.672958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.673108 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.673014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.673108 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.673042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.675380 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.675361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.681519 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.681494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptn88\" (UniqueName: \"kubernetes.io/projected/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kube-api-access-ptn88\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.798059 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.797966 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:43.953468 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:43.953441 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"]
Apr 16 18:56:43.956020 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:56:43.955991 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53fed02e_5bf8_4978_9442_835f0ea7b9ab.slice/crio-60519953ab0f6c903006892d29a5f083e69627a3e3efff948a1888f6a5cdbd49 WatchSource:0}: Error finding container 60519953ab0f6c903006892d29a5f083e69627a3e3efff948a1888f6a5cdbd49: Status 404 returned error can't find the container with id 60519953ab0f6c903006892d29a5f083e69627a3e3efff948a1888f6a5cdbd49
Apr 16 18:56:44.489033 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:44.488994 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" event={"ID":"53fed02e-5bf8-4978-9442-835f0ea7b9ab","Type":"ContainerStarted","Data":"c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02"}
Apr 16 18:56:44.489521 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:44.489041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" event={"ID":"53fed02e-5bf8-4978-9442-835f0ea7b9ab","Type":"ContainerStarted","Data":"60519953ab0f6c903006892d29a5f083e69627a3e3efff948a1888f6a5cdbd49"}
Apr 16 18:56:45.494565 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:45.494524 2576 generic.go:358] "Generic (PLEG): container finished" podID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerID="c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02" exitCode=0
Apr 16 18:56:45.494974 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:45.494611 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" event={"ID":"53fed02e-5bf8-4978-9442-835f0ea7b9ab","Type":"ContainerDied","Data":"c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02"}
Apr 16 18:56:46.500864 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:46.500826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" event={"ID":"53fed02e-5bf8-4978-9442-835f0ea7b9ab","Type":"ContainerStarted","Data":"f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b"}
Apr 16 18:56:46.500864 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:46.500870 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" event={"ID":"53fed02e-5bf8-4978-9442-835f0ea7b9ab","Type":"ContainerStarted","Data":"be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20"}
Apr 16 18:56:46.501460 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:46.500965 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:46.521793 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:46.521715 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" podStartSLOduration=3.521698134 podStartE2EDuration="3.521698134s" podCreationTimestamp="2026-04-16 18:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:46.520193022 +0000 UTC m=+1580.986747932" watchObservedRunningTime="2026-04-16 18:56:46.521698134 +0000 UTC m=+1580.988253020"
Apr 16 18:56:53.798550 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:53.798485 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:53.799003 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:53.798565 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:53.801475 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:53.801451 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:56:54.537157 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:56:54.537131 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:57:15.541262 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:57:15.541228 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:59:25.447310 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:25.447274 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"]
Apr 16 18:59:25.447896 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:25.447741 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="main" containerID="cri-o://be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20" gracePeriod=30
Apr 16 18:59:25.447896 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:25.447842 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="tokenizer" containerID="cri-o://f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b" gracePeriod=30
Apr 16 18:59:25.540453 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:59:25.540387 2576 logging.go:55] [core] [Channel #773 SubChannel #774]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.42:9003", ServerName: "10.132.0.42:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.42:9003: connect: connection refused"
Apr 16 18:59:26.016492 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.016455 2576 generic.go:358] "Generic (PLEG): container finished" podID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerID="be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20" exitCode=0
Apr 16 18:59:26.016678 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.016530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" event={"ID":"53fed02e-5bf8-4978-9442-835f0ea7b9ab","Type":"ContainerDied","Data":"be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20"}
Apr 16 18:59:26.540569 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.540518 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.42:9003\" within 1s: context deadline exceeded"
Apr 16 18:59:26.799963 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.799900 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"
Apr 16 18:59:26.826488 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826455 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pfvn2/must-gather-vvnsf"]
Apr 16 18:59:26.826807 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826795 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="storage-initializer"
Apr 16 18:59:26.826867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826809 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="storage-initializer"
Apr 16 18:59:26.826867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826818 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="tokenizer"
Apr 16 18:59:26.826867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826823 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="tokenizer"
Apr 16 18:59:26.826867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826842 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="main"
Apr 16 18:59:26.826867 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826848 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="main"
Apr 16 18:59:26.827036 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826903 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="main"
Apr 16 18:59:26.827036 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.826911 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerName="tokenizer"
Apr 16 18:59:26.830432 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.830382 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfvn2/must-gather-vvnsf"
Apr 16 18:59:26.835228 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.835202 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfvn2\"/\"openshift-service-ca.crt\""
Apr 16 18:59:26.835363 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.835201 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pfvn2\"/\"default-dockercfg-65ct7\""
Apr 16 18:59:26.835363 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.835201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfvn2\"/\"kube-root-ca.crt\""
Apr 16 18:59:26.846157 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.846128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfvn2/must-gather-vvnsf"]
Apr 16 18:59:26.870259 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870223 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-cache\") pod \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") "
Apr 16 18:59:26.870259 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870267 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-uds\") pod \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") "
Apr 16 18:59:26.870534 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870287 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-tmp\") pod \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") "
Apr 16 18:59:26.870534 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870306 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptn88\" (UniqueName: \"kubernetes.io/projected/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kube-api-access-ptn88\") pod \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") "
Apr 16 18:59:26.870534 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870334 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kserve-provision-location\") pod \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") "
Apr 16 18:59:26.870534 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870375 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tls-certs\") pod \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\" (UID: \"53fed02e-5bf8-4978-9442-835f0ea7b9ab\") "
Apr 16 18:59:26.870534 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phk6z\" (UniqueName: \"kubernetes.io/projected/e926054c-7b2a-4a1e-b7fc-af884f412555-kube-api-access-phk6z\") pod \"must-gather-vvnsf\" (UID: \"e926054c-7b2a-4a1e-b7fc-af884f412555\") " pod="openshift-must-gather-pfvn2/must-gather-vvnsf"
Apr 16 18:59:26.870766 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e926054c-7b2a-4a1e-b7fc-af884f412555-must-gather-output\") pod \"must-gather-vvnsf\" (UID: \"e926054c-7b2a-4a1e-b7fc-af884f412555\") " pod="openshift-must-gather-pfvn2/must-gather-vvnsf"
Apr 16 18:59:26.870766 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870619 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "53fed02e-5bf8-4978-9442-835f0ea7b9ab" (UID: "53fed02e-5bf8-4978-9442-835f0ea7b9ab"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:59:26.870766 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870632 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "53fed02e-5bf8-4978-9442-835f0ea7b9ab" (UID: "53fed02e-5bf8-4978-9442-835f0ea7b9ab"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:59:26.870766 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.870681 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "53fed02e-5bf8-4978-9442-835f0ea7b9ab" (UID: "53fed02e-5bf8-4978-9442-835f0ea7b9ab"). InnerVolumeSpecName "tokenizer-tmp".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:26.871263 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.871243 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "53fed02e-5bf8-4978-9442-835f0ea7b9ab" (UID: "53fed02e-5bf8-4978-9442-835f0ea7b9ab"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:26.872755 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.872728 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "53fed02e-5bf8-4978-9442-835f0ea7b9ab" (UID: "53fed02e-5bf8-4978-9442-835f0ea7b9ab"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:59:26.872860 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.872802 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kube-api-access-ptn88" (OuterVolumeSpecName: "kube-api-access-ptn88") pod "53fed02e-5bf8-4978-9442-835f0ea7b9ab" (UID: "53fed02e-5bf8-4978-9442-835f0ea7b9ab"). InnerVolumeSpecName "kube-api-access-ptn88". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:59:26.971428 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phk6z\" (UniqueName: \"kubernetes.io/projected/e926054c-7b2a-4a1e-b7fc-af884f412555-kube-api-access-phk6z\") pod \"must-gather-vvnsf\" (UID: \"e926054c-7b2a-4a1e-b7fc-af884f412555\") " pod="openshift-must-gather-pfvn2/must-gather-vvnsf" Apr 16 18:59:26.971610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e926054c-7b2a-4a1e-b7fc-af884f412555-must-gather-output\") pod \"must-gather-vvnsf\" (UID: \"e926054c-7b2a-4a1e-b7fc-af884f412555\") " pod="openshift-must-gather-pfvn2/must-gather-vvnsf" Apr 16 18:59:26.971610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971526 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-cache\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:59:26.971610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971538 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-uds\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:59:26.971610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971548 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tokenizer-tmp\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:59:26.971610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971557 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptn88\" (UniqueName: 
\"kubernetes.io/projected/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kube-api-access-ptn88\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:59:26.971610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971565 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53fed02e-5bf8-4978-9442-835f0ea7b9ab-kserve-provision-location\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:59:26.971610 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971575 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53fed02e-5bf8-4978-9442-835f0ea7b9ab-tls-certs\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 18:59:26.971861 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.971836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e926054c-7b2a-4a1e-b7fc-af884f412555-must-gather-output\") pod \"must-gather-vvnsf\" (UID: \"e926054c-7b2a-4a1e-b7fc-af884f412555\") " pod="openshift-must-gather-pfvn2/must-gather-vvnsf" Apr 16 18:59:26.990727 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:26.990686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phk6z\" (UniqueName: \"kubernetes.io/projected/e926054c-7b2a-4a1e-b7fc-af884f412555-kube-api-access-phk6z\") pod \"must-gather-vvnsf\" (UID: \"e926054c-7b2a-4a1e-b7fc-af884f412555\") " pod="openshift-must-gather-pfvn2/must-gather-vvnsf" Apr 16 18:59:27.028373 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.028332 2576 generic.go:358] "Generic (PLEG): container finished" podID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" containerID="f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b" exitCode=0 Apr 16 18:59:27.028594 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.028440 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" event={"ID":"53fed02e-5bf8-4978-9442-835f0ea7b9ab","Type":"ContainerDied","Data":"f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b"} Apr 16 18:59:27.028594 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.028481 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" Apr 16 18:59:27.028594 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.028504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv" event={"ID":"53fed02e-5bf8-4978-9442-835f0ea7b9ab","Type":"ContainerDied","Data":"60519953ab0f6c903006892d29a5f083e69627a3e3efff948a1888f6a5cdbd49"} Apr 16 18:59:27.028594 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.028543 2576 scope.go:117] "RemoveContainer" containerID="f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b" Apr 16 18:59:27.037618 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.037596 2576 scope.go:117] "RemoveContainer" containerID="be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20" Apr 16 18:59:27.045194 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.045176 2576 scope.go:117] "RemoveContainer" containerID="c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02" Apr 16 18:59:27.052897 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.052878 2576 scope.go:117] "RemoveContainer" containerID="f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b" Apr 16 18:59:27.053166 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:59:27.053140 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b\": container with ID starting with 
f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b not found: ID does not exist" containerID="f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b" Apr 16 18:59:27.053215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.053172 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b"} err="failed to get container status \"f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b\": rpc error: code = NotFound desc = could not find container \"f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b\": container with ID starting with f3dfe89c5461ddbc89a8e5911aeecbdef2ac6fef87a6ae30b3b30d4392789b9b not found: ID does not exist" Apr 16 18:59:27.053215 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.053192 2576 scope.go:117] "RemoveContainer" containerID="be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20" Apr 16 18:59:27.053440 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:59:27.053417 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20\": container with ID starting with be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20 not found: ID does not exist" containerID="be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20" Apr 16 18:59:27.053493 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.053450 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20"} err="failed to get container status \"be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20\": rpc error: code = NotFound desc = could not find container \"be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20\": container with ID starting with 
be22ceef9af1a0527013bb496b637587d7e27f40bda9a637f5d5433d47994c20 not found: ID does not exist" Apr 16 18:59:27.053493 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.053473 2576 scope.go:117] "RemoveContainer" containerID="c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02" Apr 16 18:59:27.053752 ip-10-0-137-47 kubenswrapper[2576]: E0416 18:59:27.053729 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02\": container with ID starting with c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02 not found: ID does not exist" containerID="c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02" Apr 16 18:59:27.053856 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.053751 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02"} err="failed to get container status \"c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02\": rpc error: code = NotFound desc = could not find container \"c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02\": container with ID starting with c7d4e8e3b354c242bb5439658cddc81e712279b0d23b1b11c007ce7eca99ce02 not found: ID does not exist" Apr 16 18:59:27.056126 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.056104 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"] Apr 16 18:59:27.064799 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.064773 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b544vr7fv"] Apr 16 18:59:27.140963 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.140929 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" Apr 16 18:59:27.267045 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.267012 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfvn2/must-gather-vvnsf"] Apr 16 18:59:27.270351 ip-10-0-137-47 kubenswrapper[2576]: W0416 18:59:27.270324 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode926054c_7b2a_4a1e_b7fc_af884f412555.slice/crio-9c64f4efa712473e92d9a373d357d2f3da6b35fd602af8526b45bdf2bea21b32 WatchSource:0}: Error finding container 9c64f4efa712473e92d9a373d357d2f3da6b35fd602af8526b45bdf2bea21b32: Status 404 returned error can't find the container with id 9c64f4efa712473e92d9a373d357d2f3da6b35fd602af8526b45bdf2bea21b32 Apr 16 18:59:27.271896 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:27.271882 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:59:28.035200 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:28.035142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" event={"ID":"e926054c-7b2a-4a1e-b7fc-af884f412555","Type":"ContainerStarted","Data":"9c64f4efa712473e92d9a373d357d2f3da6b35fd602af8526b45bdf2bea21b32"} Apr 16 18:59:28.181323 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:28.181281 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53fed02e-5bf8-4978-9442-835f0ea7b9ab" path="/var/lib/kubelet/pods/53fed02e-5bf8-4978-9442-835f0ea7b9ab/volumes" Apr 16 18:59:32.051997 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:32.051958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" event={"ID":"e926054c-7b2a-4a1e-b7fc-af884f412555","Type":"ContainerStarted","Data":"96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097"} Apr 16 18:59:32.051997 ip-10-0-137-47 
kubenswrapper[2576]: I0416 18:59:32.051994 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" event={"ID":"e926054c-7b2a-4a1e-b7fc-af884f412555","Type":"ContainerStarted","Data":"4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444"} Apr 16 18:59:32.067574 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:32.067521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" podStartSLOduration=2.144438703 podStartE2EDuration="6.067506096s" podCreationTimestamp="2026-04-16 18:59:26 +0000 UTC" firstStartedPulling="2026-04-16 18:59:27.272000542 +0000 UTC m=+1741.738555407" lastFinishedPulling="2026-04-16 18:59:31.195067937 +0000 UTC m=+1745.661622800" observedRunningTime="2026-04-16 18:59:32.066387869 +0000 UTC m=+1746.532942881" watchObservedRunningTime="2026-04-16 18:59:32.067506096 +0000 UTC m=+1746.534061018" Apr 16 18:59:55.485584 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:55.485556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-64bf8854b4-776ph_3e312f71-4f6a-4206-99c4-62f2f2ab84ef/router/0.log" Apr 16 18:59:56.328939 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:56.328905 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-64bf8854b4-776ph_3e312f71-4f6a-4206-99c4-62f2f2ab84ef/router/0.log" Apr 16 18:59:57.151667 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:57.151638 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-b28b6_c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f/kuadrant-console-plugin/0.log" Apr 16 18:59:58.150707 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:58.150620 2576 generic.go:358] "Generic (PLEG): container finished" podID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerID="4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444" exitCode=0 Apr 16 
18:59:58.150707 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:58.150656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" event={"ID":"e926054c-7b2a-4a1e-b7fc-af884f412555","Type":"ContainerDied","Data":"4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444"} Apr 16 18:59:58.150978 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:58.150962 2576 scope.go:117] "RemoveContainer" containerID="4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444" Apr 16 18:59:58.891355 ip-10-0-137-47 kubenswrapper[2576]: I0416 18:59:58.891312 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pfvn2_must-gather-vvnsf_e926054c-7b2a-4a1e-b7fc-af884f412555/gather/0.log" Apr 16 19:00:02.696760 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:02.696719 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ggqxt_8617aaa8-5382-49c3-9fbd-7f66b89d8525/global-pull-secret-syncer/0.log" Apr 16 19:00:02.801561 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:02.801522 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j85qr_dffbf089-0f9c-412d-8cef-d3e8343e0951/konnectivity-agent/0.log" Apr 16 19:00:02.889782 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:02.889730 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-47.ec2.internal_17da44616c39894cc6f4732c6b243af1/haproxy/0.log" Apr 16 19:00:04.417115 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.417077 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pfvn2/must-gather-vvnsf"] Apr 16 19:00:04.417546 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.417318 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerName="copy" 
containerID="cri-o://96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097" gracePeriod=2 Apr 16 19:00:04.422066 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.421358 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pfvn2/must-gather-vvnsf"] Apr 16 19:00:04.647997 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.647974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pfvn2_must-gather-vvnsf_e926054c-7b2a-4a1e-b7fc-af884f412555/copy/0.log" Apr 16 19:00:04.648328 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.648302 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" Apr 16 19:00:04.650531 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.650505 2576 status_manager.go:895] "Failed to get status for pod" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" err="pods \"must-gather-vvnsf\" is forbidden: User \"system:node:ip-10-0-137-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pfvn2\": no relationship found between node 'ip-10-0-137-47.ec2.internal' and this object" Apr 16 19:00:04.805561 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.805531 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phk6z\" (UniqueName: \"kubernetes.io/projected/e926054c-7b2a-4a1e-b7fc-af884f412555-kube-api-access-phk6z\") pod \"e926054c-7b2a-4a1e-b7fc-af884f412555\" (UID: \"e926054c-7b2a-4a1e-b7fc-af884f412555\") " Apr 16 19:00:04.805754 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.805608 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e926054c-7b2a-4a1e-b7fc-af884f412555-must-gather-output\") pod \"e926054c-7b2a-4a1e-b7fc-af884f412555\" (UID: 
\"e926054c-7b2a-4a1e-b7fc-af884f412555\") " Apr 16 19:00:04.808045 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.808007 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e926054c-7b2a-4a1e-b7fc-af884f412555-kube-api-access-phk6z" (OuterVolumeSpecName: "kube-api-access-phk6z") pod "e926054c-7b2a-4a1e-b7fc-af884f412555" (UID: "e926054c-7b2a-4a1e-b7fc-af884f412555"). InnerVolumeSpecName "kube-api-access-phk6z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:00:04.813856 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.813820 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e926054c-7b2a-4a1e-b7fc-af884f412555-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e926054c-7b2a-4a1e-b7fc-af884f412555" (UID: "e926054c-7b2a-4a1e-b7fc-af884f412555"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:00:04.906487 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.906445 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phk6z\" (UniqueName: \"kubernetes.io/projected/e926054c-7b2a-4a1e-b7fc-af884f412555-kube-api-access-phk6z\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 19:00:04.906487 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:04.906483 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e926054c-7b2a-4a1e-b7fc-af884f412555-must-gather-output\") on node \"ip-10-0-137-47.ec2.internal\" DevicePath \"\"" Apr 16 19:00:05.175594 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.175516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pfvn2_must-gather-vvnsf_e926054c-7b2a-4a1e-b7fc-af884f412555/copy/0.log" Apr 16 19:00:05.175878 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.175858 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerID="96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097" exitCode=143 Apr 16 19:00:05.175930 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.175895 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" Apr 16 19:00:05.175998 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.175979 2576 scope.go:117] "RemoveContainer" containerID="96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097" Apr 16 19:00:05.178026 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.177998 2576 status_manager.go:895] "Failed to get status for pod" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" err="pods \"must-gather-vvnsf\" is forbidden: User \"system:node:ip-10-0-137-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pfvn2\": no relationship found between node 'ip-10-0-137-47.ec2.internal' and this object" Apr 16 19:00:05.184218 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.184200 2576 scope.go:117] "RemoveContainer" containerID="4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444" Apr 16 19:00:05.186028 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.186002 2576 status_manager.go:895] "Failed to get status for pod" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" err="pods \"must-gather-vvnsf\" is forbidden: User \"system:node:ip-10-0-137-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pfvn2\": no relationship found between node 'ip-10-0-137-47.ec2.internal' and this object" Apr 16 19:00:05.198056 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.198036 2576 scope.go:117] "RemoveContainer" containerID="96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097" Apr 16 
19:00:05.198368 ip-10-0-137-47 kubenswrapper[2576]: E0416 19:00:05.198348 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097\": container with ID starting with 96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097 not found: ID does not exist" containerID="96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097" Apr 16 19:00:05.198447 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.198379 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097"} err="failed to get container status \"96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097\": rpc error: code = NotFound desc = could not find container \"96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097\": container with ID starting with 96917382695a082746bec2b931c7f909bfd92b53e654a4b998127e678fb73097 not found: ID does not exist" Apr 16 19:00:05.198447 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.198413 2576 scope.go:117] "RemoveContainer" containerID="4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444" Apr 16 19:00:05.198629 ip-10-0-137-47 kubenswrapper[2576]: E0416 19:00:05.198612 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444\": container with ID starting with 4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444 not found: ID does not exist" containerID="4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444" Apr 16 19:00:05.198676 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:05.198635 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444"} err="failed to get container status \"4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444\": rpc error: code = NotFound desc = could not find container \"4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444\": container with ID starting with 4a0292ae149af15a76c590358704819636d1cd49386ab5250ed44e2fc622f444 not found: ID does not exist"
Apr 16 19:00:06.179063 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:06.179031 2576 status_manager.go:895] "Failed to get status for pod" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" pod="openshift-must-gather-pfvn2/must-gather-vvnsf" err="pods \"must-gather-vvnsf\" is forbidden: User \"system:node:ip-10-0-137-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-pfvn2\": no relationship found between node 'ip-10-0-137-47.ec2.internal' and this object"
Apr 16 19:00:06.180460 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:06.180438 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" path="/var/lib/kubelet/pods/e926054c-7b2a-4a1e-b7fc-af884f412555/volumes"
Apr 16 19:00:07.163548 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:07.163517 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-b28b6_c771b2d7-ddc6-4d54-bfd4-d50d3bfd300f/kuadrant-console-plugin/0.log"
Apr 16 19:00:08.419495 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:08.419387 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-jsghj_592a7c8f-97a7-4307-9682-3926fa559c11/cluster-monitoring-operator/0.log"
Apr 16 19:00:08.722958 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:08.722869 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tp9dv_c4b49530-da68-40c1-86b7-5787b7b11a79/node-exporter/0.log"
Apr 16 19:00:08.743182 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:08.743148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tp9dv_c4b49530-da68-40c1-86b7-5787b7b11a79/kube-rbac-proxy/0.log"
Apr 16 19:00:08.772946 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:08.772917 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tp9dv_c4b49530-da68-40c1-86b7-5787b7b11a79/init-textfile/0.log"
Apr 16 19:00:10.645536 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:10.645503 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-t9jsh_fcc6daec-498a-4d51-950c-80666fb565da/networking-console-plugin/0.log"
Apr 16 19:00:11.162191 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.162155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 19:00:11.173119 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.173085 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/2.log"
Apr 16 19:00:11.629065 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.629016 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-fsthh_b4b5ffce-2f92-4b13-b96b-d7fa243d1a13/download-server/0.log"
Apr 16 19:00:11.749028 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.748966 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"]
Apr 16 19:00:11.749435 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.749270 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerName="gather"
Apr 16 19:00:11.749435 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.749281 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerName="gather"
Apr 16 19:00:11.749435 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.749290 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerName="copy"
Apr 16 19:00:11.749435 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.749295 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerName="copy"
Apr 16 19:00:11.749435 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.749348 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerName="gather"
Apr 16 19:00:11.749435 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.749357 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e926054c-7b2a-4a1e-b7fc-af884f412555" containerName="copy"
Apr 16 19:00:11.756002 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.755975 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.759144 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.759119 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-djjwc\"/\"kube-root-ca.crt\""
Apr 16 19:00:11.759543 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.759143 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-djjwc\"/\"default-dockercfg-bxwpq\""
Apr 16 19:00:11.759543 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.759187 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-djjwc\"/\"openshift-service-ca.crt\""
Apr 16 19:00:11.760148 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.760129 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"]
Apr 16 19:00:11.869784 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.869749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-podres\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.869784 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.869786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr4gs\" (UniqueName: \"kubernetes.io/projected/6e7f4b31-329a-463d-ae93-a31ef03fd18d-kube-api-access-sr4gs\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.870028 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.869831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-proc\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.870028 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.869850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-sys\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.870028 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.869869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-lib-modules\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.970931 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.970823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-proc\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.970931 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.970871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-sys\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.971159 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.970949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-sys\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.971159 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.970953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-proc\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.971159 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.970995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-lib-modules\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.971159 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.971065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-podres\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.971159 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.971128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr4gs\" (UniqueName: \"kubernetes.io/projected/6e7f4b31-329a-463d-ae93-a31ef03fd18d-kube-api-access-sr4gs\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.971159 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.971149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-podres\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.971375 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.971163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6e7f4b31-329a-463d-ae93-a31ef03fd18d-lib-modules\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:11.979252 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:11.979221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr4gs\" (UniqueName: \"kubernetes.io/projected/6e7f4b31-329a-463d-ae93-a31ef03fd18d-kube-api-access-sr4gs\") pod \"perf-node-gather-daemonset-khqwz\" (UID: \"6e7f4b31-329a-463d-ae93-a31ef03fd18d\") " pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:12.067297 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:12.067258 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:12.083987 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:12.083935 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-7nw2l_f927df6e-69e1-4c13-8409-28c80b811150/volume-data-source-validator/0.log"
Apr 16 19:00:12.200817 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:12.200793 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"]
Apr 16 19:00:12.203221 ip-10-0-137-47 kubenswrapper[2576]: W0416 19:00:12.203195 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6e7f4b31_329a_463d_ae93_a31ef03fd18d.slice/crio-67d7da711aadf090c01cdec103360539c1217ef8ddd3c1961410b8f61da401f2 WatchSource:0}: Error finding container 67d7da711aadf090c01cdec103360539c1217ef8ddd3c1961410b8f61da401f2: Status 404 returned error can't find the container with id 67d7da711aadf090c01cdec103360539c1217ef8ddd3c1961410b8f61da401f2
Apr 16 19:00:12.838813 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:12.838779 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9ms5f_cef0db6d-a3ae-4198-8447-b4ee557da9d1/dns/0.log"
Apr 16 19:00:12.860628 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:12.860588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9ms5f_cef0db6d-a3ae-4198-8447-b4ee557da9d1/kube-rbac-proxy/0.log"
Apr 16 19:00:13.019349 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:13.019314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rs8w8_129c086c-bc70-4407-a43e-26664dfb816c/dns-node-resolver/0.log"
Apr 16 19:00:13.204988 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:13.204889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz" event={"ID":"6e7f4b31-329a-463d-ae93-a31ef03fd18d","Type":"ContainerStarted","Data":"089c307003927225d8e92b52a04b1e518dbf6f9e2452f1a1e3685c61d055eabd"}
Apr 16 19:00:13.204988 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:13.204925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz" event={"ID":"6e7f4b31-329a-463d-ae93-a31ef03fd18d","Type":"ContainerStarted","Data":"67d7da711aadf090c01cdec103360539c1217ef8ddd3c1961410b8f61da401f2"}
Apr 16 19:00:13.205219 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:13.205001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:13.225204 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:13.225150 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz" podStartSLOduration=2.225132474 podStartE2EDuration="2.225132474s" podCreationTimestamp="2026-04-16 19:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:00:13.224050878 +0000 UTC m=+1787.690605774" watchObservedRunningTime="2026-04-16 19:00:13.225132474 +0000 UTC m=+1787.691687351"
Apr 16 19:00:13.534728 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:13.534629 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-9k6mz_1c74f02e-39bc-4ee2-bd6c-07d23ece32a2/node-ca/0.log"
Apr 16 19:00:14.494488 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:14.494452 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-64bf8854b4-776ph_3e312f71-4f6a-4206-99c4-62f2f2ab84ef/router/0.log"
Apr 16 19:00:14.944983 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:14.944947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7cbsl_2bdf3a87-71c1-4f98-8f1f-f3ebbbbf916f/serve-healthcheck-canary/0.log"
Apr 16 19:00:15.450022 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:15.449991 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-g7zhh_3fe5dd28-9069-4e1e-9331-ddd24da0b5f2/insights-operator/0.log"
Apr 16 19:00:15.452939 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:15.452915 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-g7zhh_3fe5dd28-9069-4e1e-9331-ddd24da0b5f2/insights-operator/1.log"
Apr 16 19:00:15.628259 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:15.628228 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pjfc2_671bae0e-2470-403d-b4f3-7c607959438a/kube-rbac-proxy/0.log"
Apr 16 19:00:15.650547 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:15.650523 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pjfc2_671bae0e-2470-403d-b4f3-7c607959438a/exporter/0.log"
Apr 16 19:00:15.674201 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:15.674170 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pjfc2_671bae0e-2470-403d-b4f3-7c607959438a/extractor/0.log"
Apr 16 19:00:18.245172 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:18.245138 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7fd84c546d-lxlf4_331f686d-81a7-475d-8b25-fa2ec126dc59/manager/0.log"
Apr 16 19:00:18.983560 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:18.983533 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-r2nnm_346719e6-ab00-4d86-86d0-7327fe9168a6/server/0.log"
Apr 16 19:00:19.219216 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:19.219175 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-djjwc/perf-node-gather-daemonset-khqwz"
Apr 16 19:00:24.030020 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:24.029991 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-bmcbm_1d6751a4-0f7f-4439-844e-5585a26c5f43/migrator/0.log"
Apr 16 19:00:24.049693 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:24.049660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-bmcbm_1d6751a4-0f7f-4439-844e-5585a26c5f43/graceful-termination/0.log"
Apr 16 19:00:24.393345 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:24.393312 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-vw2xc_3e48aa88-413f-40b4-bf6a-2dc0acc72e3a/kube-storage-version-migrator-operator/1.log"
Apr 16 19:00:24.394773 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:24.394749 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-vw2xc_3e48aa88-413f-40b4-bf6a-2dc0acc72e3a/kube-storage-version-migrator-operator/0.log"
Apr 16 19:00:25.358922 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.358889 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll6hq_f2a54163-a62f-47da-993d-f3471a740635/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:00:25.381679 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.381644 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll6hq_f2a54163-a62f-47da-993d-f3471a740635/egress-router-binary-copy/0.log"
Apr 16 19:00:25.402142 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.402103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll6hq_f2a54163-a62f-47da-993d-f3471a740635/cni-plugins/0.log"
Apr 16 19:00:25.422938 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.422906 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll6hq_f2a54163-a62f-47da-993d-f3471a740635/bond-cni-plugin/0.log"
Apr 16 19:00:25.446371 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.446341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll6hq_f2a54163-a62f-47da-993d-f3471a740635/routeoverride-cni/0.log"
Apr 16 19:00:25.467076 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.467049 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll6hq_f2a54163-a62f-47da-993d-f3471a740635/whereabouts-cni-bincopy/0.log"
Apr 16 19:00:25.488644 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.488620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ll6hq_f2a54163-a62f-47da-993d-f3471a740635/whereabouts-cni/0.log"
Apr 16 19:00:25.922938 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.922908 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w4gwm_e3170e08-0669-40fc-b2a2-105f865f2be9/kube-multus/0.log"
Apr 16 19:00:25.995308 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:25.995270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n66hf_e8425304-94d1-408f-ac22-f5bb6adfce75/network-metrics-daemon/0.log"
Apr 16 19:00:26.020230 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:26.020193 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n66hf_e8425304-94d1-408f-ac22-f5bb6adfce75/kube-rbac-proxy/0.log"
Apr 16 19:00:26.182831 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:26.182747 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 19:00:26.186180 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:26.186157 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-b7927_765cda1d-eaf6-43b6-a926-4ad4fe965542/console-operator/1.log"
Apr 16 19:00:27.557095 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:27.557051 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tchmw_08cb14f4-383f-4b43-8944-b2fe93cf6dff/ovn-controller/0.log"
Apr 16 19:00:27.592268 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:27.592213 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tchmw_08cb14f4-383f-4b43-8944-b2fe93cf6dff/ovn-acl-logging/0.log"
Apr 16 19:00:27.613426 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:27.613380 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tchmw_08cb14f4-383f-4b43-8944-b2fe93cf6dff/kube-rbac-proxy-node/0.log"
Apr 16 19:00:27.635502 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:27.635468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tchmw_08cb14f4-383f-4b43-8944-b2fe93cf6dff/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:00:27.662548 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:27.662521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tchmw_08cb14f4-383f-4b43-8944-b2fe93cf6dff/northd/0.log"
Apr 16 19:00:27.683556 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:27.683527 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tchmw_08cb14f4-383f-4b43-8944-b2fe93cf6dff/nbdb/0.log"
Apr 16 19:00:27.707438 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:27.707386 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tchmw_08cb14f4-383f-4b43-8944-b2fe93cf6dff/sbdb/0.log"
Apr 16 19:00:27.893458 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:27.893363 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tchmw_08cb14f4-383f-4b43-8944-b2fe93cf6dff/ovnkube-controller/0.log"
Apr 16 19:00:28.975826 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:28.975786 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-jxkbj_cee6ee12-c77c-4b90-a41c-75571be006dc/check-endpoints/0.log"
Apr 16 19:00:29.052377 ip-10-0-137-47 kubenswrapper[2576]: I0416 19:00:29.052345 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qbq69_8837a43b-32fb-45cb-9303-bc2b56966e5f/network-check-target-container/0.log"