Apr 24 21:23:59.972212 ip-10-0-131-237 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:23:59.972225 ip-10-0-131-237 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:23:59.972235 ip-10-0-131-237 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:23:59.972540 ip-10-0-131-237 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:24:10.098476 ip-10-0-131-237 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:24:10.098491 ip-10-0-131-237 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot af718515f826440297175ba7dd913e0c --
Apr 24 21:26:25.890031 ip-10-0-131-237 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:26:26.330708 ip-10-0-131-237 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:26.330708 ip-10-0-131-237 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:26:26.330708 ip-10-0-131-237 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:26.330708 ip-10-0-131-237 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:26:26.330708 ip-10-0-131-237 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:26:26.333116 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.333025 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:26:26.336230 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336213 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:26.336230 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336230 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336233 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336237 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336240 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336243 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336246 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336249 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336252 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336254 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336257 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336260 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336267 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336270 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336273 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336275 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336278 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336280 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336284 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336288 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:26.336300 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336291 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336294 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336297 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336300 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336302 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336305 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336308 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336311 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336314 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336317 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336319 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336322 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336324 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336326 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336329 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336331 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336334 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336336 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336339 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336342 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:26.336776 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336345 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336347 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336349 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336352 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336354 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336357 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336359 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336361 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336364 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336366 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336368 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336371 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336373 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336376 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336379 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336382 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336385 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336387 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336390 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336392 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:26.337280 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336395 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336398 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336401 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336404 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336406 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336408 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336411 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336413 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336416 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336418 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336421 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336424 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336426 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336429 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336432 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336434 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336437 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336439 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336441 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336444 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:26.337932 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336446 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:26.338585 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336450 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:26.338585 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336453 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:26.338585 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336457 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:26.338585 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336461 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:26.338585 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.336464 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:26.338997 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.338981 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:26.338997 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.338996 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:26.338997 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339000 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339004 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339007 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339011 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339014 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339018 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339021 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339024 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339027 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339030 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339032 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339035 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339038 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339041 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339043 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339046 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339048 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339051 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339053 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339056 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:26.339115 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339060 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339065 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339073 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339077 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339081 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339084 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339087 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339090 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339092 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339095 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339098 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339100 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339103 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339106 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339109 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339112 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339114 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339117 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339120 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339122 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:26.339626 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339125 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339127 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339130 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339132 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339135 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339138 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339140 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339142 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339145 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339147 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339150 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339152 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339155 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339158 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339160 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339163 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339165 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339167 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339170 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339173 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:26.340198 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339176 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339179 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339182 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339184 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339187 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339190 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339193 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339195 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339198 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339200 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339203 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339205 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339208 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339210 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339213 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339215 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339226 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339229 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339232 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:26.340701 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339235 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339237 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339240 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339242 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339245 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339319 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339327 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339343 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339347 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339351 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339355 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339359 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339364 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339367 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339370 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339373 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339376 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339379 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339382 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339386 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339389 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339391 2580 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339394 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339397 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:26:26.341155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339402 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339404 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339407 2580 flags.go:64] FLAG: --config-dir=""
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339410 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339413 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339418 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339426 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339430 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339433 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339436 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339439 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339442 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339445 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339448 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339455 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339458 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339461 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339464 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339467 2580 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339470 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339477 2580 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339481 2580 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339483 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339486 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339489 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:26:26.341750 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339493 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339496 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339499 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339502 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339505 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339508 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339511 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339514 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339517 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339520 2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:26:26.342359 ip-10-0-131-237
kubenswrapper[2580]: I0424 21:26:26.339522 2580 flags.go:64] FLAG: --feature-gates="" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339526 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339529 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339532 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339541 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339544 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339547 2580 flags.go:64] FLAG: --help="false" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339550 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339553 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339556 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339559 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339562 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339565 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339568 2580 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 24 21:26:26.342359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339571 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339574 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339577 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339580 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339583 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339586 2580 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339589 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339591 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339594 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339597 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339600 2580 flags.go:64] FLAG: --lock-file="" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339603 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339606 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339609 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: 
I0424 21:26:26.339615 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339618 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339620 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339623 2580 flags.go:64] FLAG: --logging-format="text" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339626 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339629 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339632 2580 flags.go:64] FLAG: --manifest-url="" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339635 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339640 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339649 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339671 2580 flags.go:64] FLAG: --max-pods="110" Apr 24 21:26:26.342954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339676 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339680 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339683 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339686 2580 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339689 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339692 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339695 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339704 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339707 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339710 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339712 2580 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339715 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339721 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339723 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339726 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339729 2580 flags.go:64] FLAG: --port="10250" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339732 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339735 
2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-02d16dc7ce271b42f" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339738 2580 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339741 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339745 2580 flags.go:64] FLAG: --register-node="true" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339747 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339750 2580 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339754 2580 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:26:26.343561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339756 2580 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339759 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339762 2580 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339766 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339769 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339772 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339774 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339783 2580 flags.go:64] FLAG: --runonce="false" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339786 2580 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339789 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339792 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339794 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339797 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339800 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339803 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339806 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339809 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339812 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339815 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339817 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339820 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339823 2580 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339826 2580 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339832 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339834 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:26:26.344209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339837 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339845 2580 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339848 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339851 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339854 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339856 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339860 2580 flags.go:64] FLAG: --v="2" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339864 2580 flags.go:64] FLAG: --version="false" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339868 2580 flags.go:64] FLAG: --vmodule="" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339873 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.339876 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339986 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:26:26.344847 
ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339989 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339994 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.339998 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340006 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340009 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340012 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340014 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340017 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340019 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340025 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:26:26.344847 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340028 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340031 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340033 2580 
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340036 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340039 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340041 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340044 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340046 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340049 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340051 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340054 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340057 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340059 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340062 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340064 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:26:26.345386 ip-10-0-131-237 
kubenswrapper[2580]: W0424 21:26:26.340067 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340070 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340074 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340077 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:26:26.345386 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340080 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340082 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340085 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340088 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340090 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340093 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340095 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340115 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 
21:26:26.340118 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340120 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340123 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340126 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340130 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340133 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340135 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340139 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340142 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340145 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340147 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340150 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:26:26.345891 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340153 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 
21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340155 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340158 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340160 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340163 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340165 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340167 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340170 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340174 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340176 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340179 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340181 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340184 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340186 2580 feature_gate.go:328] unrecognized 
feature gate: NewOLMPreflightPermissionChecks Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340189 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340191 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340194 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340196 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340199 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340201 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:26:26.346395 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340209 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340212 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340214 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340217 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340222 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340224 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:26:26.346983 ip-10-0-131-237 
kubenswrapper[2580]: W0424 21:26:26.340227 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340231 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340233 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340236 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340238 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340241 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340243 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340246 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340248 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.340251 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:26:26.346983 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.340982 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:26:26.347742 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.347721 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:26:26.347782 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.347743 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:26:26.347814 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347795 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:26:26.347814 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347801 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:26:26.347814 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347805 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:26:26.347814 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347808 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:26:26.347814 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347811 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:26:26.347814 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347815 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347820 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347825 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347829 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347833 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347836 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347839 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347842 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347852 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347855 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347858 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347861 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347863 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347866 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347869 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347872 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347874 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347877 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347879 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:26.347960 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347882 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347884 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347887 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347889 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347892 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347894 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347897 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347899 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347902 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347904 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347907 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347909 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347912 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347914 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347916 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347920 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347923 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347925 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347928 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347930 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:26.348461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347933 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347935 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347939 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347941 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347944 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347946 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347949 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347951 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347954 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347956 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347958 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347961 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347964 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347966 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347969 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347971 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347974 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347977 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347979 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347982 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:26.348970 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347986 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347988 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347991 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347993 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347995 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.347998 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348000 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348003 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348006 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348008 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348011 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348013 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348016 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348019 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348021 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348024 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348027 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348029 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348032 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348034 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:26.349467 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348037 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348040 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.348044 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348152 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348157 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348160 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348163 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348166 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348169 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348172 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348174 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348177 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348180 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348183 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348185 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348188 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:26:26.350054 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348190 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348194 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348198 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348201 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348204 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348206 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348209 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348211 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348215 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348218 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348221 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348224 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348228 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348230 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348233 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348235 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348238 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348240 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348243 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:26:26.350432 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348246 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348248 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348251 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348253 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348256 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348258 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348261 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348263 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348266 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348268 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348271 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348273 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348275 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348278 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348280 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348283 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348285 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348288 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348290 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:26:26.350910 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348293 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348295 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348298 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348300 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348303 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348305 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348307 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348310 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348313 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348315 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348318 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348320 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348322 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348325 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348327 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348329 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348332 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348335 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348337 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348339 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:26:26.351381 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348342 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348344 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348347 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348349 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348352 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348354 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348357 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348359 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348362 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348364 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348367 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348369 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348372 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348374 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:26.348376 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.348381 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:26:26.351890 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.349233 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:26:26.353705 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.353690 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:26:26.355019 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.355007 2580 server.go:1019] "Starting client certificate rotation"
Apr 24 21:26:26.355120 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.355103 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:26.355159 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.355141 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:26:26.381345 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.381324 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:26.384428 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.384405 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:26:26.403919 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.403888 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:26:26.409514 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.409494 2580 log.go:25] "Validated CRI v1 image API"
Apr 24 21:26:26.409888 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.409869 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:26.411260 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.411234 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:26:26.413696 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.413672 2580 fs.go:135] Filesystem UUIDs: map[23157f20-b885-4d47-9f2a-00545d618c39:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f9dcd1ed-3f4e-44a4-8d85-adb6cec6a4ad:/dev/nvme0n1p3]
Apr 24 21:26:26.413788 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.413694 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:26:26.419950 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.419799 2580 manager.go:217] Machine: {Timestamp:2026-04-24 21:26:26.417530317 +0000 UTC m=+0.401745087 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3091989 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c9fca95a70bd2504077c96556d1fa SystemUUID:ec2c9fca-95a7-0bd2-5040-77c96556d1fa BootID:af718515-f826-4402-9717-5ba7dd913e0c Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bb:98:12:09:d1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bb:98:12:09:d1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:40:5c:49:eb:30 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:26:26.419950 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.419936 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:26:26.420099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.420087 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:26:26.422006 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.421965 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:26:26.422167 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.422010 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-237.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 21:26:26.422218 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.422174 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 21:26:26.422218 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.422183 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 21:26:26.422218 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.422202
2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:26.422218 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.422217 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:26:26.423789 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.423774 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:26.424149 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.424136 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:26:26.427000 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.426983 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:26:26.427066 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.427023 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:26:26.427066 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.427042 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:26:26.427066 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.427063 2580 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:26:26.427157 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.427077 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:26:26.428188 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.428172 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:26.428244 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.428193 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:26:26.431260 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.431234 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:26:26.432751 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:26:26.432734 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:26:26.434814 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434799 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:26:26.434864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434820 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:26:26.434864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434827 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:26:26.434864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434833 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:26:26.434864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434841 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:26:26.434864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434848 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:26:26.434864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434855 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:26:26.434864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434860 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:26:26.434864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434868 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:26:26.435072 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434874 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:26:26.435072 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434892 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 
21:26:26.435072 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.434902 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:26:26.436597 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.436584 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:26:26.436632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.436599 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:26:26.439925 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.439899 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-237.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:26:26.440081 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.439950 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-237.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:26:26.440146 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.440077 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:26:26.440989 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.440974 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:26:26.441047 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.441026 2580 server.go:1295] "Started kubelet" Apr 24 21:26:26.441222 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.441184 2580 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 24 21:26:26.441280 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.441191 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:26:26.441280 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.441262 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:26:26.441982 ip-10-0-131-237 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:26:26.442630 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.442613 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:26:26.447156 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.447123 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:26:26.452157 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.449582 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-237.ec2.internal.18a96819703e8813 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-237.ec2.internal,UID:ip-10-0-131-237.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-237.ec2.internal,},FirstTimestamp:2026-04-24 21:26:26.440988691 +0000 UTC m=+0.425203460,LastTimestamp:2026-04-24 21:26:26.440988691 +0000 UTC m=+0.425203460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-237.ec2.internal,}" Apr 24 21:26:26.453180 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.453076 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:26:26.453180 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.453167 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:26:26.453722 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.453701 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:26:26.454405 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454387 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:26:26.454488 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454414 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:26:26.454488 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454480 2580 factory.go:55] Registering systemd factory Apr 24 21:26:26.454597 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454385 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:26:26.454597 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454515 2580 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:26:26.454597 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454581 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:26:26.454597 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454592 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:26:26.454842 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.454821 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found" Apr 24 21:26:26.454957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454931 2580 factory.go:153] Registering CRI-O factory Apr 24 21:26:26.454957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454943 2580 factory.go:223] Registration of the crio container factory successfully Apr 24 
21:26:26.455068 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.454983 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:26:26.455068 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.455000 2580 factory.go:103] Registering Raw factory Apr 24 21:26:26.455068 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.455010 2580 manager.go:1196] Started watching for new ooms in manager Apr 24 21:26:26.455336 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.455323 2580 manager.go:319] Starting recovery of all containers Apr 24 21:26:26.458047 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.458018 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jbtlw" Apr 24 21:26:26.463735 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.463551 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jbtlw" Apr 24 21:26:26.465294 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.465273 2580 manager.go:324] Recovery completed Apr 24 21:26:26.466023 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.465999 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:26:26.466120 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.466019 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-237.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API 
group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:26:26.469471 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.469458 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:26.472116 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.472093 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:26.472187 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.472133 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:26.472187 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.472146 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:26.472644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.472630 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:26:26.472718 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.472644 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:26:26.472718 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.472678 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:26:26.475979 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.475968 2580 policy_none.go:49] "None policy: Start" Apr 24 21:26:26.476021 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.475984 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:26:26.476376 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.476367 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:26:26.514421 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.514399 2580 manager.go:341] "Starting Device Plugin manager" Apr 24 21:26:26.514624 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.514459 2580 
manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:26:26.514624 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.514474 2580 server.go:85] "Starting device plugin registration server" Apr 24 21:26:26.514933 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.514800 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:26:26.514933 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.514817 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:26:26.514933 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.514900 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:26:26.515253 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.514990 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:26:26.515253 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.515000 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:26:26.515689 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.515652 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:26:26.515758 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.515713 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-237.ec2.internal\" not found" Apr 24 21:26:26.571311 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.571256 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:26:26.572859 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.572843 2580 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 21:26:26.572964 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.572870 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:26:26.572964 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.572890 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:26:26.572964 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.572897 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:26:26.572964 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.572929 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:26:26.576363 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.576343 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:26:26.615254 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.615169 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:26.616309 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.616294 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:26.616379 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.616325 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:26.616379 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.616337 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:26.616379 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.616360 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.625641 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:26:26.625624 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.625697 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.625647 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-237.ec2.internal\": node \"ip-10-0-131-237.ec2.internal\" not found" Apr 24 21:26:26.648093 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.648069 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found" Apr 24 21:26:26.673813 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.673784 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal"] Apr 24 21:26:26.673885 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.673858 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:26.674805 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.674788 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:26.674879 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.674822 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:26.674879 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.674835 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:26.676979 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.676966 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:26.677119 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.677102 2580 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.677161 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.677141 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:26.677740 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.677725 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:26.677793 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.677741 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:26.677793 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.677757 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:26.677793 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.677775 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:26.677889 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.677760 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:26.677889 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.677836 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:26.679787 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.679773 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.679847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.679798 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:26:26.680465 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.680450 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:26:26.680533 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.680481 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:26:26.680533 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.680494 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:26:26.702456 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.702426 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-237.ec2.internal\" not found" node="ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.707203 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.707181 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-237.ec2.internal\" not found" node="ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.748676 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.748622 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found" Apr 24 21:26:26.755468 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.755443 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/dd6742b0eb7dbf06dae244f302a2b2ad-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal\" (UID: \"dd6742b0eb7dbf06dae244f302a2b2ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.755631 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.755476 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd6742b0eb7dbf06dae244f302a2b2ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal\" (UID: \"dd6742b0eb7dbf06dae244f302a2b2ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.755631 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.755502 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d266bc3b3ff16c73161a2d12b87975ef-config\") pod \"kube-apiserver-proxy-ip-10-0-131-237.ec2.internal\" (UID: \"d266bc3b3ff16c73161a2d12b87975ef\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.848989 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.848943 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found" Apr 24 21:26:26.856312 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.856286 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd6742b0eb7dbf06dae244f302a2b2ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal\" (UID: \"dd6742b0eb7dbf06dae244f302a2b2ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal" Apr 24 21:26:26.856404 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.856320 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/d266bc3b3ff16c73161a2d12b87975ef-config\") pod \"kube-apiserver-proxy-ip-10-0-131-237.ec2.internal\" (UID: \"d266bc3b3ff16c73161a2d12b87975ef\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:26.856404 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.856340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/dd6742b0eb7dbf06dae244f302a2b2ad-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal\" (UID: \"dd6742b0eb7dbf06dae244f302a2b2ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:26.856404 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.856383 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/dd6742b0eb7dbf06dae244f302a2b2ad-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal\" (UID: \"dd6742b0eb7dbf06dae244f302a2b2ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:26.856521 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.856400 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dd6742b0eb7dbf06dae244f302a2b2ad-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal\" (UID: \"dd6742b0eb7dbf06dae244f302a2b2ad\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:26.856521 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:26.856408 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d266bc3b3ff16c73161a2d12b87975ef-config\") pod \"kube-apiserver-proxy-ip-10-0-131-237.ec2.internal\" (UID: \"d266bc3b3ff16c73161a2d12b87975ef\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:26.949831 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:26.949747 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found"
Apr 24 21:26:27.005243 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.005210 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:27.009917 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.009898 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:27.050740 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:27.050710 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found"
Apr 24 21:26:27.151250 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:27.151212 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found"
Apr 24 21:26:27.251898 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:27.251797 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found"
Apr 24 21:26:27.352458 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:27.352420 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found"
Apr 24 21:26:27.354631 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.354609 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 21:26:27.354796 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.354778 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 21:26:27.453204 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:27.453172 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-237.ec2.internal\" not found"
Apr 24 21:26:27.454276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.454253 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 21:26:27.465947 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.465892 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:21:26 +0000 UTC" deadline="2028-02-09 00:28:57.81629191 +0000 UTC"
Apr 24 21:26:27.465947 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.465943 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15723h2m30.350354573s"
Apr 24 21:26:27.466139 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.465971 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:26:27.484837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.484802 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-lrrfp"
Apr 24 21:26:27.487883 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.487857 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:27.493086 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.493046 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-lrrfp"
Apr 24 21:26:27.542818 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.542792 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:27.546083 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:27.546053 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd6742b0eb7dbf06dae244f302a2b2ad.slice/crio-284df4e34e67a01701031bddc8c4a90253ee46e0aced6f3dfc562384bff3460c WatchSource:0}: Error finding container 284df4e34e67a01701031bddc8c4a90253ee46e0aced6f3dfc562384bff3460c: Status 404 returned error can't find the container with id 284df4e34e67a01701031bddc8c4a90253ee46e0aced6f3dfc562384bff3460c
Apr 24 21:26:27.546468 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:27.546450 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd266bc3b3ff16c73161a2d12b87975ef.slice/crio-04be061eefaae5434d8f2dc1fed307d46e344efc277c9dbd10945cc7fa4a5dd9 WatchSource:0}: Error finding container 04be061eefaae5434d8f2dc1fed307d46e344efc277c9dbd10945cc7fa4a5dd9: Status 404 returned error can't find the container with id 04be061eefaae5434d8f2dc1fed307d46e344efc277c9dbd10945cc7fa4a5dd9
Apr 24 21:26:27.550602 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.550578 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:26:27.554206 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.554185 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:27.566616 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.566594 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:27.567564 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.567551 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal"
Apr 24 21:26:27.573391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.573372 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 21:26:27.576259 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.576219 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal" event={"ID":"d266bc3b3ff16c73161a2d12b87975ef","Type":"ContainerStarted","Data":"04be061eefaae5434d8f2dc1fed307d46e344efc277c9dbd10945cc7fa4a5dd9"}
Apr 24 21:26:27.577122 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.577106 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal" event={"ID":"dd6742b0eb7dbf06dae244f302a2b2ad","Type":"ContainerStarted","Data":"284df4e34e67a01701031bddc8c4a90253ee46e0aced6f3dfc562384bff3460c"}
Apr 24 21:26:27.760073 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:27.760046 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:28.429018 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.428933 2580 apiserver.go:52] "Watching apiserver"
Apr 24 21:26:28.437920 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.437893 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 21:26:28.440309 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.440285 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-vnffz","kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal","openshift-cluster-node-tuning-operator/tuned-mhkc8","openshift-image-registry/node-ca-tcjn9","openshift-multus/multus-8hrvb","openshift-multus/network-metrics-daemon-hf9r5","openshift-network-diagnostics/network-check-target-hzw5v","openshift-network-operator/iptables-alerter-rcz8z","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal","openshift-multus/multus-additional-cni-plugins-tqs9r","openshift-ovn-kubernetes/ovnkube-node-zwmjf"]
Apr 24 21:26:28.444913 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.444883 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vnffz"
Apr 24 21:26:28.447119 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.447094 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.449294 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.449250 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.449436 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.449363 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.449842 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.449822 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-m6fp9\""
Apr 24 21:26:28.449977 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.449822 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 21:26:28.450122 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.450099 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 21:26:28.450340 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.450320 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:28.450440 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.450355 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:28.450440 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.450415 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kjl5r\""
Apr 24 21:26:28.451652 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.451620 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:28.451747 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:28.451710 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:28.452652 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.452627 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 21:26:28.452893 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.452876 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 21:26:28.453329 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.453083 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-cklfj\""
Apr 24 21:26:28.453397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.453366 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 21:26:28.453687 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.453650 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 21:26:28.453866 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.453847 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 21:26:28.453940 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.453873 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 21:26:28.454095 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.454079 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:26:28.454208 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.454193 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4ldgr\""
Apr 24 21:26:28.456919 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.456602 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:28.456919 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:28.456692 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:28.456919 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.456703 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rcz8z"
Apr 24 21:26:28.459243 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.458954 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.459243 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.459165 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:28.459381 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.459364 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:28.459700 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.459681 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f45lh\""
Apr 24 21:26:28.459843 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.459819 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:26:28.462165 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.462142 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 21:26:28.462284 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.462274 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 21:26:28.462467 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.462447 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-tc6vd\""
Apr 24 21:26:28.462545 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.462516 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 21:26:28.463959 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.463941 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.464059 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.463969 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.466207 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466113 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:28.466207 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e-agent-certs\") pod \"konnectivity-agent-vnffz\" (UID: \"7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e\") " pod="kube-system/konnectivity-agent-vnffz"
Apr 24 21:26:28.466207 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466172 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysctl-conf\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.466397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466219 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wld\" (UniqueName: \"kubernetes.io/projected/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-kube-api-access-88wld\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.466397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466252 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-etc-kubernetes\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.466397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466280 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftq97\" (UniqueName: \"kubernetes.io/projected/c966a75c-1583-49c7-802b-498b767cf3f6-kube-api-access-ftq97\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.466397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466305 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-run\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.466397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466330 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-system-cni-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.466397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466354 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-hostroot\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.466397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466376 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-tuned\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.466397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466391 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9eec410-b753-4a8a-93ff-a5c67112bf0a-tmp\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466424 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzc8b\" (UniqueName: \"kubernetes.io/projected/b692a988-6486-4028-b418-e5eac0cb57fb-kube-api-access-wzc8b\") pod \"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466447 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-var-lib-kubelet\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466485 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-host\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466507 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-cni-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466529 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-socket-dir-parent\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-k8s-cni-cncf-io\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466593 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-sys\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466614 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-serviceca\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466638 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c966a75c-1583-49c7-802b-498b767cf3f6-multus-daemon-config\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466697 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b692a988-6486-4028-b418-e5eac0cb57fb-iptables-alerter-script\") pod \"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z"
Apr 24 21:26:28.466725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466721 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b692a988-6486-4028-b418-e5eac0cb57fb-host-slash\") pod \"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466761 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhr2\" (UniqueName: \"kubernetes.io/projected/88be4377-88c5-417f-8cba-f0a7f6d5f16e-kube-api-access-sdhr2\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466786 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-modprobe-d\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysconfig\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466831 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-kubernetes\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466872 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysctl-d\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466903 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-cni-bin\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-kubelet\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.466972 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e-konnectivity-ca\") pod \"konnectivity-agent-vnffz\" (UID: \"7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e\") " pod="kube-system/konnectivity-agent-vnffz"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467006 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzfl\" (UniqueName: \"kubernetes.io/projected/a9eec410-b753-4a8a-93ff-a5c67112bf0a-kube-api-access-cvzfl\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467033 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-cnibin\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467056 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-netns\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467073 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-cni-multus\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467087 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-conf-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467106 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-multus-certs\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467120 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-systemd\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.467137 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467134 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-lib-modules\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.467885 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-host\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.467885 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467171 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-os-release\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467885 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467203 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c966a75c-1583-49c7-802b-498b767cf3f6-cni-binary-copy\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.467885 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.467232 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:28.469336 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.468899 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 21:26:28.469336 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.468901 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:26:28.469336 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.468953 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:26:28.469336 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.468983 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 21:26:28.469336 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.469096 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 21:26:28.469336 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.469200 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f6cv6\""
Apr 24 21:26:28.469336 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.469275 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pdvjd\""
Apr 24 21:26:28.469639 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.469612 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 21:26:28.469741 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.469730 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 21:26:28.469789 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.469612 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 21:26:28.489469 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.489448 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:26:28.493726 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.493693 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:27 +0000 UTC" deadline="2027-11-24 22:33:53.407723569 +0000 UTC"
Apr 24 21:26:28.493843 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.493734 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13897h7m24.914001352s"
Apr 24 21:26:28.555959 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.555925 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:26:28.567643 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567601 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.567643 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567643 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tqs9r\" (UID:
\"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.567882 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysctl-conf\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.567882 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88wld\" (UniqueName: \"kubernetes.io/projected/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-kube-api-access-88wld\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9" Apr 24 21:26:28.567882 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567837 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-var-lib-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.567882 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567873 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.568028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567875 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysctl-conf\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.568028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567929 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-system-cni-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567949 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-slash\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.568028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.567983 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-tuned\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.568028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568008 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzc8b\" (UniqueName: \"kubernetes.io/projected/b692a988-6486-4028-b418-e5eac0cb57fb-kube-api-access-wzc8b\") pod \"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z" Apr 24 21:26:28.568228 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568032 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-systemd-units\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.568228 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568054 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.568228 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568031 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-system-cni-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568228 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568192 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-etc-selinux\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-var-lib-kubelet\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568250 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-socket-dir-parent\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568267 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-k8s-cni-cncf-io\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568284 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-env-overrides\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568300 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkmb2\" (UniqueName: \"kubernetes.io/projected/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-kube-api-access-mkmb2\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568319 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-registration-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568345 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c966a75c-1583-49c7-802b-498b767cf3f6-multus-daemon-config\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-k8s-cni-cncf-io\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568360 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-socket-dir-parent\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568372 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b692a988-6486-4028-b418-e5eac0cb57fb-iptables-alerter-script\") pod \"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z" Apr 24 21:26:28.568391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568387 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-var-lib-kubelet\") pod \"tuned-mhkc8\" (UID: 
\"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568397 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b692a988-6486-4028-b418-e5eac0cb57fb-host-slash\") pod \"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568379 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568438 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b692a988-6486-4028-b418-e5eac0cb57fb-host-slash\") pod \"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568478 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovnkube-config\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568517 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhr2\" (UniqueName: \"kubernetes.io/projected/88be4377-88c5-417f-8cba-f0a7f6d5f16e-kube-api-access-sdhr2\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " 
pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568563 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-kubernetes\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568584 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-cni-bin\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568607 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-kubelet\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568644 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-kubernetes\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568633 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-conf-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " 
pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568697 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-etc-kubernetes\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568698 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-cni-bin\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568713 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-kubelet\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568724 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftq97\" (UniqueName: \"kubernetes.io/projected/c966a75c-1583-49c7-802b-498b767cf3f6-kube-api-access-ftq97\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568756 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e-konnectivity-ca\") pod \"konnectivity-agent-vnffz\" (UID: \"7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e\") " pod="kube-system/konnectivity-agent-vnffz" Apr 
24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568773 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-conf-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568799 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-cni-multus\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.568905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568757 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-etc-kubernetes\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-multus-certs\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568851 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-run-netns\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.569632 
ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568859 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-var-lib-cni-multus\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568876 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovnkube-script-lib\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568901 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-system-cni-dir\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568927 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568904 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-multus-certs\") pod 
\"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-systemd\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.568978 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-lib-modules\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-systemd\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569003 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-log-socket\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569018 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b692a988-6486-4028-b418-e5eac0cb57fb-iptables-alerter-script\") pod 
\"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569038 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-socket-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569069 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-cni-binary-copy\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569023 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c966a75c-1583-49c7-802b-498b767cf3f6-multus-daemon-config\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569099 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:26:28.569632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569129 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e-agent-certs\") pod \"konnectivity-agent-vnffz\" (UID: \"7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e\") " pod="kube-system/konnectivity-agent-vnffz" Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569134 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-lib-modules\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569214 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-kubelet\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:28.569227 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569273 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-systemd\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:28.569301 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs podName:88be4377-88c5-417f-8cba-f0a7f6d5f16e nodeName:}" failed. 
No retries permitted until 2026-04-24 21:26:29.069270718 +0000 UTC m=+3.053485475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs") pod "network-metrics-daemon-hf9r5" (UID: "88be4377-88c5-417f-8cba-f0a7f6d5f16e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569362 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e-konnectivity-ca\") pod \"konnectivity-agent-vnffz\" (UID: \"7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e\") " pod="kube-system/konnectivity-agent-vnffz"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569391 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-etc-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569434 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml495\" (UniqueName: \"kubernetes.io/projected/995ed227-fb30-4b70-9c48-e4516dc0a85c-kube-api-access-ml495\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569463 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-run\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569519 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-run\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569557 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-hostroot\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569586 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-os-release\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9eec410-b753-4a8a-93ff-a5c67112bf0a-tmp\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569648 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-node-log\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569694 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-cni-netd\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569735 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-hostroot\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.570425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569787 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmr4\" (UniqueName: \"kubernetes.io/projected/7f722441-3b01-48ed-9900-6d96012e5c31-kube-api-access-9bmr4\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-host\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-cni-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569873 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-host\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569922 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-cni-bin\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569936 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-multus-cni-dir\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.569989 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-sys\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570026 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-serviceca\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-sys\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570069 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-modprobe-d\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570112 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysconfig\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysctl-d\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovn-node-metrics-cert\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570171 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysconfig\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570186 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-sys-fs\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570222 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzfl\" (UniqueName: \"kubernetes.io/projected/a9eec410-b753-4a8a-93ff-a5c67112bf0a-kube-api-access-cvzfl\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-modprobe-d\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570256 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-cnibin\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570287 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-netns\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570301 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-cnibin\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570332 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-host-run-netns\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-sysctl-d\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570419 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-serviceca\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-ovn\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570598 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-device-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570623 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-cnibin\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570640 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-host\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-os-release\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570717 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-host\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c966a75c-1583-49c7-802b-498b767cf3f6-cni-binary-copy\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.570802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c966a75c-1583-49c7-802b-498b767cf3f6-os-release\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.571837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.571099 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c966a75c-1583-49c7-802b-498b767cf3f6-cni-binary-copy\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.572499 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.572076 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a9eec410-b753-4a8a-93ff-a5c67112bf0a-etc-tuned\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.572499 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.572374 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e-agent-certs\") pod \"konnectivity-agent-vnffz\" (UID: \"7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e\") " pod="kube-system/konnectivity-agent-vnffz"
Apr 24 21:26:28.576077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.576050 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a9eec410-b753-4a8a-93ff-a5c67112bf0a-tmp\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.578926 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.578899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wld\" (UniqueName: \"kubernetes.io/projected/6fcc9f57-5d75-40c0-88a6-9f4985a693ad-kube-api-access-88wld\") pod \"node-ca-tcjn9\" (UID: \"6fcc9f57-5d75-40c0-88a6-9f4985a693ad\") " pod="openshift-image-registry/node-ca-tcjn9"
Apr 24 21:26:28.579022 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.578991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzc8b\" (UniqueName: \"kubernetes.io/projected/b692a988-6486-4028-b418-e5eac0cb57fb-kube-api-access-wzc8b\") pod \"iptables-alerter-rcz8z\" (UID: \"b692a988-6486-4028-b418-e5eac0cb57fb\") " pod="openshift-network-operator/iptables-alerter-rcz8z"
Apr 24 21:26:28.579125 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.579105 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftq97\" (UniqueName: \"kubernetes.io/projected/c966a75c-1583-49c7-802b-498b767cf3f6-kube-api-access-ftq97\") pod \"multus-8hrvb\" (UID: \"c966a75c-1583-49c7-802b-498b767cf3f6\") " pod="openshift-multus/multus-8hrvb"
Apr 24 21:26:28.579183 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.579170 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvzfl\" (UniqueName: \"kubernetes.io/projected/a9eec410-b753-4a8a-93ff-a5c67112bf0a-kube-api-access-cvzfl\") pod \"tuned-mhkc8\" (UID: \"a9eec410-b753-4a8a-93ff-a5c67112bf0a\") " pod="openshift-cluster-node-tuning-operator/tuned-mhkc8"
Apr 24 21:26:28.580265 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.580245 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhr2\" (UniqueName: \"kubernetes.io/projected/88be4377-88c5-417f-8cba-f0a7f6d5f16e-kube-api-access-sdhr2\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:28.581729 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:28.581714 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:28.581782 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:28.581732 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:28.581782 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:28.581742 2580 projected.go:194] Error preparing data for projected volume kube-api-access-fc7wh for pod openshift-network-diagnostics/network-check-target-hzw5v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:28.581877 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:28.581796 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh podName:7e98eeca-23eb-4e4c-b591-118f914a93a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:29.081781797 +0000 UTC m=+3.065996572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fc7wh" (UniqueName: "kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh") pod "network-check-target-hzw5v" (UID: "7e98eeca-23eb-4e4c-b591-118f914a93a1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:28.671223 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671175 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-var-lib-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671262 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-slash\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671282 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-systemd-units\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671297 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671257 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-var-lib-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671321 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-etc-selinux\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671347 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-env-overrides\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671376 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-systemd-units\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671388 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkmb2\" (UniqueName: \"kubernetes.io/projected/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-kube-api-access-mkmb2\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-etc-selinux\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671442 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-registration-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671487 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovnkube-config\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671502 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-registration-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671314 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671525 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-run-netns\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671549 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovnkube-script-lib\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671574 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-system-cni-dir\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671597 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-log-socket\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671648 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-socket-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-cni-binary-copy\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671348 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-slash\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671743 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-system-cni-dir\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671762 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-kubelet\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671764 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-log-socket\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.671897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671791 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-systemd\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671820 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-etc-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-run-netns\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671847 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml495\" (UniqueName: \"kubernetes.io/projected/995ed227-fb30-4b70-9c48-e4516dc0a85c-kube-api-access-ml495\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671866 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-systemd\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671878 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-os-release\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671904 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-node-log\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671907 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-kubelet\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671928 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-cni-netd\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671954 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmr4\" (UniqueName: \"kubernetes.io/projected/7f722441-3b01-48ed-9900-6d96012e5c31-kube-api-access-9bmr4\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.671982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-cni-bin\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672011 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovn-node-metrics-cert\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672038 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-sys-fs\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672047 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672059 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovnkube-config\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672153 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-socket-dir\") 
pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-etc-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.672748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672216 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-cni-binary-copy\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672244 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-ovn\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672271 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672307 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-os-release\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672333 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovnkube-script-lib\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672351 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-device-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672340 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-node-log\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672366 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-cni-bin\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672374 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-cni-netd\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672395 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-ovn\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672394 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-cnibin\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672435 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672443 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-device-dir\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672486 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/995ed227-fb30-4b70-9c48-e4516dc0a85c-cnibin\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672487 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672507 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.673332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672533 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-run-openvswitch\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673822 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7f722441-3b01-48ed-9900-6d96012e5c31-sys-fs\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.673822 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.672895 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-env-overrides\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.673822 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.673039 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.673822 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.673100 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/995ed227-fb30-4b70-9c48-e4516dc0a85c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.674776 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.674758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-ovn-node-metrics-cert\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.682328 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.682273 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmr4\" (UniqueName: \"kubernetes.io/projected/7f722441-3b01-48ed-9900-6d96012e5c31-kube-api-access-9bmr4\") pod \"aws-ebs-csi-driver-node-x9xf9\" (UID: \"7f722441-3b01-48ed-9900-6d96012e5c31\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.682471 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.682429 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml495\" (UniqueName: \"kubernetes.io/projected/995ed227-fb30-4b70-9c48-e4516dc0a85c-kube-api-access-ml495\") pod \"multus-additional-cni-plugins-tqs9r\" (UID: \"995ed227-fb30-4b70-9c48-e4516dc0a85c\") " pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.683099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.683079 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkmb2\" (UniqueName: \"kubernetes.io/projected/be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe-kube-api-access-mkmb2\") pod \"ovnkube-node-zwmjf\" (UID: \"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:28.756622 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.756576 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vnffz" Apr 24 21:26:28.765465 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.765433 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-tcjn9" Apr 24 21:26:28.774310 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.774279 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" Apr 24 21:26:28.778101 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.778072 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8hrvb" Apr 24 21:26:28.785724 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.785696 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rcz8z" Apr 24 21:26:28.792416 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.792388 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" Apr 24 21:26:28.799080 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.799057 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" Apr 24 21:26:28.803786 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:28.803768 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:29.074903 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.074821 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:26:29.075059 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.074973 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:29.075059 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.075042 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs podName:88be4377-88c5-417f-8cba-f0a7f6d5f16e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:30.075022602 +0000 UTC m=+4.059237362 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs") pod "network-metrics-daemon-hf9r5" (UID: "88be4377-88c5-417f-8cba-f0a7f6d5f16e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:26:29.175343 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.175309 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:26:29.175489 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.175460 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:26:29.175489 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.175477 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:26:29.175489 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.175486 2580 projected.go:194] Error preparing data for projected volume kube-api-access-fc7wh for pod openshift-network-diagnostics/network-check-target-hzw5v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:29.175581 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.175550 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh podName:7e98eeca-23eb-4e4c-b591-118f914a93a1 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:26:30.175535557 +0000 UTC m=+4.159750317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fc7wh" (UniqueName: "kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh") pod "network-check-target-hzw5v" (UID: "7e98eeca-23eb-4e4c-b591-118f914a93a1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:26:29.210167 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:29.210138 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ae4eef7_d0e0_4916_bb0a_0752f2af5c3e.slice/crio-4e2b1500656830e2addad80932da0c770c8df1a3c9cd42e3178a5c4c4cc43b8f WatchSource:0}: Error finding container 4e2b1500656830e2addad80932da0c770c8df1a3c9cd42e3178a5c4c4cc43b8f: Status 404 returned error can't find the container with id 4e2b1500656830e2addad80932da0c770c8df1a3c9cd42e3178a5c4c4cc43b8f Apr 24 21:26:29.211461 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:29.211435 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe9fed09_b2c0_4c7e_a5be_a42ee2e2edfe.slice/crio-fe118726bcf8aa9e12ab5280858e7079f8c81d8d9f1f5eef5369b78ac62c4412 WatchSource:0}: Error finding container fe118726bcf8aa9e12ab5280858e7079f8c81d8d9f1f5eef5369b78ac62c4412: Status 404 returned error can't find the container with id fe118726bcf8aa9e12ab5280858e7079f8c81d8d9f1f5eef5369b78ac62c4412 Apr 24 21:26:29.214645 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:29.214621 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fcc9f57_5d75_40c0_88a6_9f4985a693ad.slice/crio-114064ff4c056b4c5820ba9ef5223df78d4b74915b68656a2cfdb769f44e2a8b WatchSource:0}: Error finding container 
114064ff4c056b4c5820ba9ef5223df78d4b74915b68656a2cfdb769f44e2a8b: Status 404 returned error can't find the container with id 114064ff4c056b4c5820ba9ef5223df78d4b74915b68656a2cfdb769f44e2a8b Apr 24 21:26:29.214934 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:29.214890 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f722441_3b01_48ed_9900_6d96012e5c31.slice/crio-c83c2bb279b9b18d41588c0a31172093f6b56038f58d23709e164501ba10b216 WatchSource:0}: Error finding container c83c2bb279b9b18d41588c0a31172093f6b56038f58d23709e164501ba10b216: Status 404 returned error can't find the container with id c83c2bb279b9b18d41588c0a31172093f6b56038f58d23709e164501ba10b216 Apr 24 21:26:29.215364 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:29.215334 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb692a988_6486_4028_b418_e5eac0cb57fb.slice/crio-25ad1e8e9b7d111b7203ea8406c55cde15ac8690198a62023ec29edf913426a1 WatchSource:0}: Error finding container 25ad1e8e9b7d111b7203ea8406c55cde15ac8690198a62023ec29edf913426a1: Status 404 returned error can't find the container with id 25ad1e8e9b7d111b7203ea8406c55cde15ac8690198a62023ec29edf913426a1 Apr 24 21:26:29.217578 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:29.217558 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc966a75c_1583_49c7_802b_498b767cf3f6.slice/crio-d2341b6ff29693d5e18d30e53fbc778f3f2412aa65687bb9dd0d4d5286263da7 WatchSource:0}: Error finding container d2341b6ff29693d5e18d30e53fbc778f3f2412aa65687bb9dd0d4d5286263da7: Status 404 returned error can't find the container with id d2341b6ff29693d5e18d30e53fbc778f3f2412aa65687bb9dd0d4d5286263da7 Apr 24 21:26:29.494477 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.494191 2580 certificate_manager.go:715] "Certificate 
rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:21:27 +0000 UTC" deadline="2027-10-15 07:37:40.783939663 +0000 UTC" Apr 24 21:26:29.494477 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.494393 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12922h11m11.289550166s" Apr 24 21:26:29.573906 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.573864 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:26:29.574063 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.574034 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1" Apr 24 21:26:29.581800 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.581742 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vqqh6"] Apr 24 21:26:29.584699 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.584671 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:29.584866 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.584762 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f" Apr 24 21:26:29.586749 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.586705 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal" event={"ID":"d266bc3b3ff16c73161a2d12b87975ef","Type":"ContainerStarted","Data":"2bda7d07c9895e351855cdab1d19c96f9d49d190932e5b03359e653be926d945"} Apr 24 21:26:29.589058 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.589031 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerStarted","Data":"ddb66686faccdaf4e7832164fb71d871fcb6aa08b93e22f25bb2e8e54969cbc9"} Apr 24 21:26:29.591460 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.591432 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hrvb" event={"ID":"c966a75c-1583-49c7-802b-498b767cf3f6","Type":"ContainerStarted","Data":"d2341b6ff29693d5e18d30e53fbc778f3f2412aa65687bb9dd0d4d5286263da7"} Apr 24 21:26:29.594722 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.594693 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" event={"ID":"7f722441-3b01-48ed-9900-6d96012e5c31","Type":"ContainerStarted","Data":"c83c2bb279b9b18d41588c0a31172093f6b56038f58d23709e164501ba10b216"} Apr 24 21:26:29.599351 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.599320 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tcjn9" event={"ID":"6fcc9f57-5d75-40c0-88a6-9f4985a693ad","Type":"ContainerStarted","Data":"114064ff4c056b4c5820ba9ef5223df78d4b74915b68656a2cfdb769f44e2a8b"} Apr 24 21:26:29.606546 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.606511 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" 
event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"fe118726bcf8aa9e12ab5280858e7079f8c81d8d9f1f5eef5369b78ac62c4412"} Apr 24 21:26:29.609120 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.609070 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vnffz" event={"ID":"7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e","Type":"ContainerStarted","Data":"4e2b1500656830e2addad80932da0c770c8df1a3c9cd42e3178a5c4c4cc43b8f"} Apr 24 21:26:29.613110 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.613045 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rcz8z" event={"ID":"b692a988-6486-4028-b418-e5eac0cb57fb","Type":"ContainerStarted","Data":"25ad1e8e9b7d111b7203ea8406c55cde15ac8690198a62023ec29edf913426a1"} Apr 24 21:26:29.617644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.617613 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" event={"ID":"a9eec410-b753-4a8a-93ff-a5c67112bf0a","Type":"ContainerStarted","Data":"a2484bcd79f63077ec72d4201c47a229709af833f6d50b21f998b217053f482e"} Apr 24 21:26:29.623099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.621727 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-237.ec2.internal" podStartSLOduration=2.621703484 podStartE2EDuration="2.621703484s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:29.621261528 +0000 UTC m=+3.605476311" watchObservedRunningTime="2026-04-24 21:26:29.621703484 +0000 UTC m=+3.605918265" Apr 24 21:26:29.680598 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.680415 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:29.680598 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.680471 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4fcf0e65-bdc1-401b-98e2-00ff3294162f-dbus\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:29.680598 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.680526 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4fcf0e65-bdc1-401b-98e2-00ff3294162f-kubelet-config\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:29.781296 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.780943 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4fcf0e65-bdc1-401b-98e2-00ff3294162f-dbus\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:29.781296 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.781022 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4fcf0e65-bdc1-401b-98e2-00ff3294162f-kubelet-config\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:29.781296 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.781079 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:29.781296 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.781198 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:29.781296 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:29.781259 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret podName:4fcf0e65-bdc1-401b-98e2-00ff3294162f nodeName:}" failed. No retries permitted until 2026-04-24 21:26:30.281240782 +0000 UTC m=+4.265455543 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret") pod "global-pull-secret-syncer-vqqh6" (UID: "4fcf0e65-bdc1-401b-98e2-00ff3294162f") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:29.781641 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.781578 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4fcf0e65-bdc1-401b-98e2-00ff3294162f-dbus\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:29.781700 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:29.781652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4fcf0e65-bdc1-401b-98e2-00ff3294162f-kubelet-config\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:30.083977 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:30.083599 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:30.083977 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.083763 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:30.083977 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.083824 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs podName:88be4377-88c5-417f-8cba-f0a7f6d5f16e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:32.083804267 +0000 UTC m=+6.068019032 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs") pod "network-metrics-daemon-hf9r5" (UID: "88be4377-88c5-417f-8cba-f0a7f6d5f16e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:30.185084 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:30.184407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:30.185084 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.184621 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:30.185084 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.184641 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:30.185084 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.184672 2580 projected.go:194] Error preparing data for projected volume kube-api-access-fc7wh for pod openshift-network-diagnostics/network-check-target-hzw5v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:30.185084 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.184738 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh podName:7e98eeca-23eb-4e4c-b591-118f914a93a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:32.184718403 +0000 UTC m=+6.168933162 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fc7wh" (UniqueName: "kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh") pod "network-check-target-hzw5v" (UID: "7e98eeca-23eb-4e4c-b591-118f914a93a1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:30.286149 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:30.285566 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:30.286149 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.285740 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:30.286149 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.285806 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret podName:4fcf0e65-bdc1-401b-98e2-00ff3294162f nodeName:}" failed. No retries permitted until 2026-04-24 21:26:31.285786178 +0000 UTC m=+5.270000949 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret") pod "global-pull-secret-syncer-vqqh6" (UID: "4fcf0e65-bdc1-401b-98e2-00ff3294162f") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:30.576741 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:30.576226 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:30.576741 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:30.576369 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:30.632066 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:30.632029 2580 generic.go:358] "Generic (PLEG): container finished" podID="dd6742b0eb7dbf06dae244f302a2b2ad" containerID="ede16b48fd5a1c438ea5304156071641d5bb6f47b0f4911dec87e8b6e73a7352" exitCode=0
Apr 24 21:26:30.633302 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:30.632953 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal" event={"ID":"dd6742b0eb7dbf06dae244f302a2b2ad","Type":"ContainerDied","Data":"ede16b48fd5a1c438ea5304156071641d5bb6f47b0f4911dec87e8b6e73a7352"}
Apr 24 21:26:31.295487 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:31.294798 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:31.295487 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:31.294995 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:31.295487 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:31.295061 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret podName:4fcf0e65-bdc1-401b-98e2-00ff3294162f nodeName:}" failed. No retries permitted until 2026-04-24 21:26:33.295042669 +0000 UTC m=+7.279257443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret") pod "global-pull-secret-syncer-vqqh6" (UID: "4fcf0e65-bdc1-401b-98e2-00ff3294162f") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:31.574036 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:31.573954 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:31.574210 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:31.574084 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:31.574516 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:31.574497 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:31.574630 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:31.574611 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:31.637379 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:31.636717 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal" event={"ID":"dd6742b0eb7dbf06dae244f302a2b2ad","Type":"ContainerStarted","Data":"23c8210eaf5cb498e07eeac5db84aa35257c3daa95e3e0e02c170164e54d1055"}
Apr 24 21:26:32.100935 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:32.100889 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:32.101109 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:32.101072 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:32.101188 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:32.101135 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs podName:88be4377-88c5-417f-8cba-f0a7f6d5f16e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:36.101115345 +0000 UTC m=+10.085330110 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs") pod "network-metrics-daemon-hf9r5" (UID: "88be4377-88c5-417f-8cba-f0a7f6d5f16e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:32.202196 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:32.202160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:32.202378 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:32.202332 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:32.202378 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:32.202354 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:32.202378 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:32.202367 2580 projected.go:194] Error preparing data for projected volume kube-api-access-fc7wh for pod openshift-network-diagnostics/network-check-target-hzw5v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:32.202524 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:32.202424 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh podName:7e98eeca-23eb-4e4c-b591-118f914a93a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:36.202405478 +0000 UTC m=+10.186620241 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fc7wh" (UniqueName: "kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh") pod "network-check-target-hzw5v" (UID: "7e98eeca-23eb-4e4c-b591-118f914a93a1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:32.576702 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:32.576149 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:32.576702 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:32.576287 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:33.313932 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:33.313620 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:33.313932 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:33.313860 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:33.313932 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:33.313930 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret podName:4fcf0e65-bdc1-401b-98e2-00ff3294162f nodeName:}" failed. No retries permitted until 2026-04-24 21:26:37.3139105 +0000 UTC m=+11.298125271 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret") pod "global-pull-secret-syncer-vqqh6" (UID: "4fcf0e65-bdc1-401b-98e2-00ff3294162f") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:33.573619 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:33.573530 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:33.573856 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:33.573738 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:33.574113 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:33.574094 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:33.574242 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:33.574220 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:34.573853 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:34.573809 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:34.574316 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:34.573964 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:35.573846 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:35.573812 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:35.574080 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:35.573940 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:35.574404 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:35.574347 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:35.574461 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:35.574441 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:36.139959 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:36.139924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:36.140149 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:36.140090 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:36.140224 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:36.140164 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs podName:88be4377-88c5-417f-8cba-f0a7f6d5f16e nodeName:}" failed. No retries permitted until 2026-04-24 21:26:44.140142738 +0000 UTC m=+18.124357500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs") pod "network-metrics-daemon-hf9r5" (UID: "88be4377-88c5-417f-8cba-f0a7f6d5f16e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:36.241174 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:36.240574 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:36.241174 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:36.240788 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:36.241174 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:36.240806 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:36.241174 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:36.240819 2580 projected.go:194] Error preparing data for projected volume kube-api-access-fc7wh for pod openshift-network-diagnostics/network-check-target-hzw5v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:36.241174 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:36.240870 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh podName:7e98eeca-23eb-4e4c-b591-118f914a93a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:26:44.240856129 +0000 UTC m=+18.225070888 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fc7wh" (UniqueName: "kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh") pod "network-check-target-hzw5v" (UID: "7e98eeca-23eb-4e4c-b591-118f914a93a1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:36.574898 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:36.574864 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:36.575256 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:36.574969 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:37.350160 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:37.350116 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:37.350330 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:37.350261 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:37.350330 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:37.350320 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret podName:4fcf0e65-bdc1-401b-98e2-00ff3294162f nodeName:}" failed. No retries permitted until 2026-04-24 21:26:45.35030576 +0000 UTC m=+19.334520521 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret") pod "global-pull-secret-syncer-vqqh6" (UID: "4fcf0e65-bdc1-401b-98e2-00ff3294162f") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:26:37.573334 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:37.573295 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:37.573334 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:37.573327 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:37.573563 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:37.573446 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:37.573860 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:37.573834 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:38.573744 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:38.573639 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:38.574178 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:38.573799 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:39.573694 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:39.573633 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:39.573694 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:39.573681 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:39.574146 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:39.573773 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:39.574146 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:39.573848 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:40.574187 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:40.574150 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:40.574681 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:40.574281 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:41.573712 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:41.573652 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:41.573906 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:41.573811 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:41.574011 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:41.573651 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:41.574144 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:41.574120 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:42.573457 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:42.573407 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:42.574069 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:42.573562 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:43.573235 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:43.573190 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:43.573447 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:43.573318 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:43.573447 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:43.573391 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:43.573813 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:43.573509 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:44.197621 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:44.197581 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:44.197818 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:44.197753 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:44.197939 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:44.197833 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs podName:88be4377-88c5-417f-8cba-f0a7f6d5f16e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.197809918 +0000 UTC m=+34.182024677 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs") pod "network-metrics-daemon-hf9r5" (UID: "88be4377-88c5-417f-8cba-f0a7f6d5f16e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:26:44.298417 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:44.298372 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:44.298567 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:44.298520 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:26:44.298567 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:44.298537 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:26:44.298567 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:44.298546 2580 projected.go:194] Error preparing data for projected volume kube-api-access-fc7wh for pod openshift-network-diagnostics/network-check-target-hzw5v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:44.298732 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:44.298603 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh podName:7e98eeca-23eb-4e4c-b591-118f914a93a1 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.298584028 +0000 UTC m=+34.282798786 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fc7wh" (UniqueName: "kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh") pod "network-check-target-hzw5v" (UID: "7e98eeca-23eb-4e4c-b591-118f914a93a1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:26:44.573853 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:44.573750 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:44.574258 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:44.573883 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e" Apr 24 21:26:45.406549 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:45.406506 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:45.406738 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:45.406636 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:45.406738 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:45.406706 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret podName:4fcf0e65-bdc1-401b-98e2-00ff3294162f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.406690741 +0000 UTC m=+35.390905500 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret") pod "global-pull-secret-syncer-vqqh6" (UID: "4fcf0e65-bdc1-401b-98e2-00ff3294162f") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:26:45.573944 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:45.573902 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:45.573944 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:45.573937 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:26:45.574398 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:45.574026 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f" Apr 24 21:26:45.574398 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:45.574101 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1" Apr 24 21:26:46.575356 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.575120 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:26:46.575709 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:46.575469 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e" Apr 24 21:26:46.664286 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.663978 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerStarted","Data":"279e952124b96dcfdd7d460d3cf0eb260d8dd6943d449eaf7df108bf40dafff0"} Apr 24 21:26:46.665284 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.665259 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hrvb" event={"ID":"c966a75c-1583-49c7-802b-498b767cf3f6","Type":"ContainerStarted","Data":"67e8eebca3cf8d0a9eff0a45ce608c02abaf02621a45d944aae3f86e9ababddb"} Apr 24 21:26:46.666391 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.666363 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" event={"ID":"7f722441-3b01-48ed-9900-6d96012e5c31","Type":"ContainerStarted","Data":"a0e4aa5e3173fc0a16923659e78fef283644bbf5c2da2dbe13b1143cb0edd09c"} Apr 24 21:26:46.667936 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.667906 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tcjn9" event={"ID":"6fcc9f57-5d75-40c0-88a6-9f4985a693ad","Type":"ContainerStarted","Data":"b221cd2403c288cd34d21146665a886e03309f721cf4a7d58ce74fc39f8572bb"} Apr 24 21:26:46.669753 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.669721 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vnffz" event={"ID":"7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e","Type":"ContainerStarted","Data":"307626fd2873c7d533a902b3990257ab7993f8f1e228bbd523a41b085bd525a5"} Apr 24 21:26:46.670833 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.670811 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" 
event={"ID":"a9eec410-b753-4a8a-93ff-a5c67112bf0a","Type":"ContainerStarted","Data":"43d9366d25a49cb7e43eb6f6155de69aad42d977714b3cd5583b5695626b24a4"} Apr 24 21:26:46.684413 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.684371 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-237.ec2.internal" podStartSLOduration=19.684339178 podStartE2EDuration="19.684339178s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:31.651983602 +0000 UTC m=+5.636198381" watchObservedRunningTime="2026-04-24 21:26:46.684339178 +0000 UTC m=+20.668553935" Apr 24 21:26:46.714309 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.714261 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mhkc8" podStartSLOduration=3.598664972 podStartE2EDuration="20.714247978s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:26:29.216210093 +0000 UTC m=+3.200424857" lastFinishedPulling="2026-04-24 21:26:46.331793104 +0000 UTC m=+20.316007863" observedRunningTime="2026-04-24 21:26:46.701019967 +0000 UTC m=+20.685234745" watchObservedRunningTime="2026-04-24 21:26:46.714247978 +0000 UTC m=+20.698462758" Apr 24 21:26:46.714453 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.714355 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tcjn9" podStartSLOduration=3.599979132 podStartE2EDuration="20.714350321s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:26:29.217438939 +0000 UTC m=+3.201653696" lastFinishedPulling="2026-04-24 21:26:46.331810128 +0000 UTC m=+20.316024885" observedRunningTime="2026-04-24 21:26:46.713863703 +0000 UTC m=+20.698078494" 
watchObservedRunningTime="2026-04-24 21:26:46.714350321 +0000 UTC m=+20.698565100" Apr 24 21:26:46.736697 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.736627 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8hrvb" podStartSLOduration=3.594588426 podStartE2EDuration="20.736612706s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:26:29.221349831 +0000 UTC m=+3.205564593" lastFinishedPulling="2026-04-24 21:26:46.363374112 +0000 UTC m=+20.347588873" observedRunningTime="2026-04-24 21:26:46.736522957 +0000 UTC m=+20.720737735" watchObservedRunningTime="2026-04-24 21:26:46.736612706 +0000 UTC m=+20.720827483" Apr 24 21:26:46.776530 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:46.776472 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vnffz" podStartSLOduration=3.656591188 podStartE2EDuration="20.776455502s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:26:29.211976675 +0000 UTC m=+3.196191449" lastFinishedPulling="2026-04-24 21:26:46.331840995 +0000 UTC m=+20.316055763" observedRunningTime="2026-04-24 21:26:46.775841226 +0000 UTC m=+20.760056001" watchObservedRunningTime="2026-04-24 21:26:46.776455502 +0000 UTC m=+20.760670278" Apr 24 21:26:47.468147 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.467931 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:26:47.525362 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.525281 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:26:47.468126635Z","UUID":"a1ee0351-25df-4154-86c1-c5a44bbf6ed5","Handler":null,"Name":"","Endpoint":""} Apr 24 21:26:47.527638 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:26:47.527403 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:26:47.527638 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.527433 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:26:47.573627 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.573594 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:26:47.573627 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.573622 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:47.573832 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:47.573728 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1" Apr 24 21:26:47.573870 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:47.573839 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f" Apr 24 21:26:47.675007 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.674980 2580 generic.go:358] "Generic (PLEG): container finished" podID="995ed227-fb30-4b70-9c48-e4516dc0a85c" containerID="279e952124b96dcfdd7d460d3cf0eb260d8dd6943d449eaf7df108bf40dafff0" exitCode=0 Apr 24 21:26:47.675742 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.675058 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerDied","Data":"279e952124b96dcfdd7d460d3cf0eb260d8dd6943d449eaf7df108bf40dafff0"} Apr 24 21:26:47.676639 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.676614 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" event={"ID":"7f722441-3b01-48ed-9900-6d96012e5c31","Type":"ContainerStarted","Data":"776185f763ba64e24f6b5a67317a4490424d293f0dc7a55552a5f68c95b6860d"} Apr 24 21:26:47.679478 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.679456 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"980b94bea89822d4577aa86378e1198147816e3cafd8c37503cbd2c2c7d4afad"} Apr 24 21:26:47.679599 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.679483 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"1ce1fbe3878ec776f2f76ff38f7f6ca92d9b0a1b9117c1480b0ce054cc2cf1b0"} Apr 24 21:26:47.679599 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.679495 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" 
event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"fbcd7acab0ece10bdf1df3a43cf4f8767cc8cd6b24ddbcb5897c7ac8d7764e71"} Apr 24 21:26:47.679599 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.679503 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"66731334ae0bd4a760051c221cfd0ff1e379f50db10f93b95fb5e2a3dae26a84"} Apr 24 21:26:47.679599 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.679510 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"4096104bbefa755718d11e2969785f507e23ea09faa1700dd2778fdca678e183"} Apr 24 21:26:47.679599 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:47.679518 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"87af4d6b60eca0192fb0bc15292dd599485cfb707be13ad6b7ddfd6ca4652174"} Apr 24 21:26:48.573496 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:48.573382 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:26:48.573744 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:48.573580 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e" Apr 24 21:26:48.682992 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:48.682951 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" event={"ID":"7f722441-3b01-48ed-9900-6d96012e5c31","Type":"ContainerStarted","Data":"056a7edbeda10038f4db8eef57b05f26dcbe92c003584537fb38ffcd2ff5d505"} Apr 24 21:26:48.684411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:48.684376 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rcz8z" event={"ID":"b692a988-6486-4028-b418-e5eac0cb57fb","Type":"ContainerStarted","Data":"94fa3c06ef5de8cc655c13fef488bfdcbcb393c7056839bfc9680a2f67c67341"} Apr 24 21:26:48.732223 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:48.732175 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rcz8z" podStartSLOduration=5.6211152680000005 podStartE2EDuration="22.732162248s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:26:29.22076479 +0000 UTC m=+3.204979546" lastFinishedPulling="2026-04-24 21:26:46.331811758 +0000 UTC m=+20.316026526" observedRunningTime="2026-04-24 21:26:48.731712614 +0000 UTC m=+22.715927395" watchObservedRunningTime="2026-04-24 21:26:48.732162248 +0000 UTC m=+22.716377026" Apr 24 21:26:48.732403 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:48.732292 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x9xf9" podStartSLOduration=3.573159225 podStartE2EDuration="22.732282871s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:26:29.217153608 +0000 UTC m=+3.201368378" lastFinishedPulling="2026-04-24 21:26:48.376277266 +0000 UTC m=+22.360492024" observedRunningTime="2026-04-24 21:26:48.714500951 
+0000 UTC m=+22.698715732" watchObservedRunningTime="2026-04-24 21:26:48.732282871 +0000 UTC m=+22.716497648" Apr 24 21:26:49.573250 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:49.573215 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:26:49.573380 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:49.573263 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:49.573380 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:49.573344 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f" Apr 24 21:26:49.573488 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:49.573460 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1" Apr 24 21:26:49.576273 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:49.576251 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vnffz" Apr 24 21:26:49.576994 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:49.576798 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vnffz" Apr 24 21:26:49.690639 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:49.690574 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"28debf80985f678b44a97fc21f1fbb924f1b2bb791aff02975259346778cc2e8"} Apr 24 21:26:49.691284 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:49.690874 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vnffz" Apr 24 21:26:49.692777 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:49.692734 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vnffz" Apr 24 21:26:50.573575 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:50.573398 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:26:50.573785 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:50.573704 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e" Apr 24 21:26:51.573433 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:51.573397 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:26:51.573953 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:51.573396 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:51.573953 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:51.573527 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1" Apr 24 21:26:51.573953 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:51.573611 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f" Apr 24 21:26:52.573846 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:52.573581 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:26:52.574512 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:52.573969 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e" Apr 24 21:26:52.698824 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:52.698783 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" event={"ID":"be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe","Type":"ContainerStarted","Data":"032c2c7adec3d1cfa6904e24423c3d616456fb00a167bbc4f95b3a2f928665e3"} Apr 24 21:26:52.699123 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:52.699100 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:52.700469 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:52.700437 2580 generic.go:358] "Generic (PLEG): container finished" podID="995ed227-fb30-4b70-9c48-e4516dc0a85c" containerID="8310fba5eb0a7fb607af5f8e0a87a07279a5f53819c23cc64db2788d648e411c" exitCode=0 Apr 24 21:26:52.700563 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:52.700475 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerDied","Data":"8310fba5eb0a7fb607af5f8e0a87a07279a5f53819c23cc64db2788d648e411c"} Apr 24 21:26:52.715308 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:52.715284 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:26:52.725356 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:26:52.725315 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" podStartSLOduration=9.098117198 podStartE2EDuration="26.72530314s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:26:29.213420782 +0000 UTC m=+3.197635548" lastFinishedPulling="2026-04-24 21:26:46.840606732 +0000 UTC m=+20.824821490" observedRunningTime="2026-04-24 21:26:52.723924522 +0000 UTC m=+26.708139301" watchObservedRunningTime="2026-04-24 21:26:52.72530314 +0000 UTC m=+26.709517919" Apr 24 21:26:53.573488 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.573460 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:26:53.573587 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.573491 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:53.573587 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:53.573562 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1" Apr 24 21:26:53.573722 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:53.573629 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f" Apr 24 21:26:53.703646 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.703549 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vqqh6"] Apr 24 21:26:53.704325 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.704298 2580 generic.go:358] "Generic (PLEG): container finished" podID="995ed227-fb30-4b70-9c48-e4516dc0a85c" containerID="6579222e103a1df775abd9949ce05ec30030eb49bf78e1004f71c765e8663963" exitCode=0 Apr 24 21:26:53.704420 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.704384 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerDied","Data":"6579222e103a1df775abd9949ce05ec30030eb49bf78e1004f71c765e8663963"} Apr 24 21:26:53.704420 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.704400 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:53.704614 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:53.704595 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:53.704737 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.704724 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:26:53.705531 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.705151 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:53.706576 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.706363 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hzw5v"]
Apr 24 21:26:53.706576 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.706451 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:53.706576 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:53.706528 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:53.706970 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.706950 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hf9r5"]
Apr 24 21:26:53.707072 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.707054 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:53.707180 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:53.707158 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:53.722042 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:53.722018 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:54.058282 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.058195 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-kp4z8"]
Apr 24 21:26:54.060939 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.060919 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.063201 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.063179 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 21:26:54.063569 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.063552 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 21:26:54.063768 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.063750 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8bhxp\""
Apr 24 21:26:54.174207 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.174163 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d58b512-cb70-43b1-ac41-1be3111f0ccc-tmp-dir\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.174359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.174277 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8d58b512-cb70-43b1-ac41-1be3111f0ccc-hosts-file\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.174359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.174344 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvpj\" (UniqueName: \"kubernetes.io/projected/8d58b512-cb70-43b1-ac41-1be3111f0ccc-kube-api-access-hpvpj\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.275469 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.275430 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d58b512-cb70-43b1-ac41-1be3111f0ccc-tmp-dir\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.275649 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.275503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8d58b512-cb70-43b1-ac41-1be3111f0ccc-hosts-file\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.275649 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.275547 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvpj\" (UniqueName: \"kubernetes.io/projected/8d58b512-cb70-43b1-ac41-1be3111f0ccc-kube-api-access-hpvpj\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.275649 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.275636 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8d58b512-cb70-43b1-ac41-1be3111f0ccc-hosts-file\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.276211 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.276194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8d58b512-cb70-43b1-ac41-1be3111f0ccc-tmp-dir\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.285695 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.285648 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvpj\" (UniqueName: \"kubernetes.io/projected/8d58b512-cb70-43b1-ac41-1be3111f0ccc-kube-api-access-hpvpj\") pod \"node-resolver-kp4z8\" (UID: \"8d58b512-cb70-43b1-ac41-1be3111f0ccc\") " pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.371344 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.371318 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kp4z8"
Apr 24 21:26:54.378419 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:26:54.378392 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d58b512_cb70_43b1_ac41_1be3111f0ccc.slice/crio-01d375156d0ae12fe1d350adcba88c044f034437896242ca2b3b727d888f54f8 WatchSource:0}: Error finding container 01d375156d0ae12fe1d350adcba88c044f034437896242ca2b3b727d888f54f8: Status 404 returned error can't find the container with id 01d375156d0ae12fe1d350adcba88c044f034437896242ca2b3b727d888f54f8
Apr 24 21:26:54.708537 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.708362 2580 generic.go:358] "Generic (PLEG): container finished" podID="995ed227-fb30-4b70-9c48-e4516dc0a85c" containerID="9b15d6c93d23ec5a05ef712eabe5e04932bf81280af19b134f37b7f16185225e" exitCode=0
Apr 24 21:26:54.708537 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.708441 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerDied","Data":"9b15d6c93d23ec5a05ef712eabe5e04932bf81280af19b134f37b7f16185225e"}
Apr 24 21:26:54.712303 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.712277 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kp4z8" event={"ID":"8d58b512-cb70-43b1-ac41-1be3111f0ccc","Type":"ContainerStarted","Data":"b294a84c36e7e844f3348def5498bb8e68d263a52031308241ab76af94591b3a"}
Apr 24 21:26:54.712393 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.712318 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kp4z8" event={"ID":"8d58b512-cb70-43b1-ac41-1be3111f0ccc","Type":"ContainerStarted","Data":"01d375156d0ae12fe1d350adcba88c044f034437896242ca2b3b727d888f54f8"}
Apr 24 21:26:54.712448 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.712419 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:26:54.744461 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:54.744400 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kp4z8" podStartSLOduration=0.744382195 podStartE2EDuration="744.382195ms" podCreationTimestamp="2026-04-24 21:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:26:54.743874999 +0000 UTC m=+28.728089779" watchObservedRunningTime="2026-04-24 21:26:54.744382195 +0000 UTC m=+28.728596971"
Apr 24 21:26:55.573865 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:55.573643 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:55.574055 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:55.573716 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:55.574055 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:55.573975 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:55.574055 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:55.573749 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:55.574184 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:55.574041 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:55.574184 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:55.574130 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:55.714407 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:55.714319 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:26:57.574122 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:57.574080 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6"
Apr 24 21:26:57.574122 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:57.574112 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:26:57.574823 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:57.574206 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5"
Apr 24 21:26:57.574823 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:57.574222 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vqqh6" podUID="4fcf0e65-bdc1-401b-98e2-00ff3294162f"
Apr 24 21:26:57.574823 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:57.574318 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hf9r5" podUID="88be4377-88c5-417f-8cba-f0a7f6d5f16e"
Apr 24 21:26:57.574823 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:57.574366 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hzw5v" podUID="7e98eeca-23eb-4e4c-b591-118f914a93a1"
Apr 24 21:26:57.586793 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:57.586752 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf"
Apr 24 21:26:57.587060 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:57.587045 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 24 21:26:57.605384 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:57.605327 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" podUID="be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 24 21:26:57.616362 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:57.616315 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" podUID="be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 24 21:26:59.343376 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.343348 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-237.ec2.internal" event="NodeReady"
Apr 24 21:26:59.344030 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.343522 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:26:59.391168 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.391134 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"]
Apr 24 21:26:59.417342 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.416749 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7c59b84694-bjv4k"]
Apr 24 21:26:59.435947 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.435891 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v"]
Apr 24 21:26:59.436164 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.436035 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"
Apr 24 21:26:59.438404 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.438373 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:59.438681 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.438636 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 21:26:59.438777 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.438761 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 21:26:59.439104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.439073 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:59.442683 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.442642 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-wbrg7\""
Apr 24 21:26:59.451548 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.450161 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"]
Apr 24 21:26:59.451548 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.450269 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.451548 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.450382 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v"
Apr 24 21:26:59.452778 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.452756 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qbk7c\""
Apr 24 21:26:59.452886 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.452820 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-skpc5\""
Apr 24 21:26:59.452957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.452881 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 21:26:59.453426 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.453129 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:26:59.453426 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.453207 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:59.453426 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.453210 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:59.453618 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.453509 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:26:59.453618 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.453559 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:26:59.460646 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.460604 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:26:59.468190 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.468164 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-66f8f496b4-4gdhz"]
Apr 24 21:26:59.468371 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.468354 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"
Apr 24 21:26:59.470959 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.470736 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:59.470959 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.470749 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-dpvvp\""
Apr 24 21:26:59.470959 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.470835 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 21:26:59.470959 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.470922 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 21:26:59.471233 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.471157 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:59.486710 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.486686 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"]
Apr 24 21:26:59.486874 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.486785 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.490553 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.490528 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 21:26:59.490553 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.490545 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 21:26:59.490756 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.490589 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 21:26:59.490756 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.490547 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 21:26:59.490756 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.490537 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 21:26:59.490890 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.490838 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 21:26:59.491058 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.491039 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sx8rz\""
Apr 24 21:26:59.506937 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.506895 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"]
Apr 24 21:26:59.506937 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.506932 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c59b84694-bjv4k"]
Apr 24 21:26:59.507152 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.506950 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8"]
Apr 24 21:26:59.507152 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.507029 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"
Apr 24 21:26:59.510647 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.510612 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:26:59.510804 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.510787 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:26:59.511136 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.511114 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bjmj7\""
Apr 24 21:26:59.511359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.511345 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 21:26:59.511848 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.511827 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 21:26:59.512229 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512205 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-image-registry-private-configuration\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.512316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512245 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e205ba8-7366-427a-9381-12562fbe3d12-ca-trust-extracted\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.512316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512270 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87pb2\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-kube-api-access-87pb2\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.512316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512304 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"
Apr 24 21:26:59.512412 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512324 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"
Apr 24 21:26:59.512412 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512340 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-trusted-ca\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.512412 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512384 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-bound-sa-token\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.512499 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512459 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-installation-pull-secrets\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.512499 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512492 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzzd5\" (UniqueName: \"kubernetes.io/projected/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-kube-api-access-wzzd5\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v"
Apr 24 21:26:59.512559 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512532 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.512559 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512553 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-registry-certificates\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.512620 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512596 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgxfm\" (UniqueName: \"kubernetes.io/projected/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-kube-api-access-fgxfm\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"
Apr 24 21:26:59.512674 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.512622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v"
Apr 24 21:26:59.521040 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.521016 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lfbbd"]
Apr 24 21:26:59.521170 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.521153 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8"
Apr 24 21:26:59.526502 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.526478 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-xkvcz\""
Apr 24 21:26:59.526502 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.526486 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:59.526693 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.526551 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:59.537220 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.537197 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v"]
Apr 24 21:26:59.537340 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.537226 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xngnk"]
Apr 24 21:26:59.537397 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.537360 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:26:59.539613 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.539591 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 21:26:59.539848 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.539830 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfvbj\""
Apr 24 21:26:59.540370 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.540353 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 21:26:59.544754 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.544728 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"]
Apr 24 21:26:59.544754 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.544756 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"]
Apr 24 21:26:59.544902 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.544770 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"]
Apr 24 21:26:59.544902 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.544878 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.548308 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.548291 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 21:26:59.548918 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.548865 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kd9tz\""
Apr 24 21:26:59.548918 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.548906 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:26:59.549229 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.549201 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:26:59.549229 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.549219 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:26:59.553480 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.553458 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 21:26:59.558997 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.558974 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6rdb2"]
Apr 24 21:26:59.559164 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.559131 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"
Apr 24 21:26:59.561816 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.561790 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qc8h5\""
Apr 24 21:26:59.562332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.562303 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 21:26:59.564432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.564412 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 21:26:59.571419 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.571399 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7mllt"]
Apr 24 21:26:59.571583 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.571564 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-6rdb2" Apr 24 21:26:59.574843 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.574812 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 21:26:59.574982 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.574930 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-gxkhc\"" Apr 24 21:26:59.575082 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.575059 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 21:26:59.575082 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.575071 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:26:59.575879 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.575861 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:26:59.583867 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.583844 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 21:26:59.586096 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.586065 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f"] Apr 24 21:26:59.586234 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.586185 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:26:59.586234 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.586205 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:26:59.586234 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.586205 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:26:59.586674 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.586630 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:26:59.588156 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.588133 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:26:59.588349 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.588323 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vd7zr\"" Apr 24 21:26:59.588530 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.588512 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fgjf8\"" Apr 24 21:26:59.589206 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.589125 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:26:59.589206 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.589143 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:26:59.589412 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.589393 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:26:59.589500 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.589426 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:26:59.589984 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.589833 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s9lwz\"" Apr 24 21:26:59.589984 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.589867 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:26:59.590329 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.590310 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:26:59.601144 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601117 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-66f8f496b4-4gdhz"] Apr 24 21:26:59.601307 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601293 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f"] Apr 24 21:26:59.601409 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601388 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f" Apr 24 21:26:59.601409 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601402 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lfbbd"] Apr 24 21:26:59.601557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601417 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7mllt"] Apr 24 21:26:59.601557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601428 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8"] Apr 24 21:26:59.601557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601440 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6rdb2"] Apr 24 21:26:59.601557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601451 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xngnk"] Apr 24 21:26:59.601557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.601464 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"] Apr 24 21:26:59.603827 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.603806 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-jkvvn\"" Apr 24 21:26:59.613759 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.613732 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-registry-certificates\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 
21:26:59.613875 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.613785 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:26:59.613875 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.613806 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-stats-auth\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:26:59.613875 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.613835 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgxfm\" (UniqueName: \"kubernetes.io/projected/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-kube-api-access-fgxfm\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" Apr 24 21:26:59.613875 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.613857 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:26:59.614141 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.613925 2580 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-87pb2\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-kube-api-access-87pb2\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.614141 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.613956 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/508d1a61-2ffa-465f-adcc-b555600791a5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5" Apr 24 21:26:59.614141 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.613965 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:26:59.614141 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.613983 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-image-registry-private-configuration\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.614141 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614036 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-registry-certificates\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.614141 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614094 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e205ba8-7366-427a-9381-12562fbe3d12-ca-trust-extracted\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.614141 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.614127 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls podName:f21b4cf7-af90-4e90-a786-e7f271ec6fcc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.114108972 +0000 UTC m=+34.098323729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7597v" (UID: "f21b4cf7-af90-4e90-a786-e7f271ec6fcc") : secret "samples-operator-tls" not found Apr 24 21:26:59.614411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-config\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" Apr 24 21:26:59.614411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614176 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rdbf\" (UniqueName: \"kubernetes.io/projected/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-kube-api-access-4rdbf\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" Apr 24 21:26:59.614411 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:26:59.614204 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e1461942-917b-4737-86c2-fbe05a16beae-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:26:59.614411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-bound-sa-token\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.614411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614329 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac4d827d-3c57-468f-bb34-d01bb87a171e-tmp-dir\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:26:59.614411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614357 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7ghz\" (UniqueName: \"kubernetes.io/projected/ac4d827d-3c57-468f-bb34-d01bb87a171e-kube-api-access-s7ghz\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:26:59.614411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614378 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: 
\"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.614411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614434 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614471 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-default-certificate\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.614554 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.614571 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c59b84694-bjv4k: secret "image-registry-tls" not found Apr 24 21:26:59.614829 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:26:59.614631 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdps8\" (UniqueName: \"kubernetes.io/projected/508d1a61-2ffa-465f-adcc-b555600791a5-kube-api-access-xdps8\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5" Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614681 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.614717 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls podName:0e205ba8-7366-427a-9381-12562fbe3d12 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.114698271 +0000 UTC m=+34.098913029 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls") pod "image-registry-7c59b84694-bjv4k" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12") : secret "image-registry-tls" not found Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614743 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-serving-cert\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614786 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" Apr 24 21:26:59.614829 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614820 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5bdq\" (UniqueName: \"kubernetes.io/projected/bc8aa199-ad2f-4557-9430-ad968419174b-kube-api-access-m5bdq\") pod \"volume-data-source-validator-7c6cbb6c87-zqdt8\" (UID: \"bc8aa199-ad2f-4557-9430-ad968419174b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8" Apr 24 21:26:59.615360 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614854 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-trusted-ca\") pod 
\"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.615360 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614895 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wld\" (UniqueName: \"kubernetes.io/projected/e1461942-917b-4737-86c2-fbe05a16beae-kube-api-access-c5wld\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:26:59.615360 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z47w2\" (UniqueName: \"kubernetes.io/projected/1b235dfc-3b04-476b-ac23-2c6473035a29-kube-api-access-z47w2\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:26:59.615360 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614975 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-installation-pull-secrets\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.616060 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.614997 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" Apr 24 21:26:59.616060 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.615959 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac4d827d-3c57-468f-bb34-d01bb87a171e-config-volume\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:26:59.616060 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.615996 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e205ba8-7366-427a-9381-12562fbe3d12-ca-trust-extracted\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.616060 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.616014 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzzd5\" (UniqueName: \"kubernetes.io/projected/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-kube-api-access-wzzd5\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:26:59.616338 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.616060 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:26:59.616338 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:26:59.616112 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508d1a61-2ffa-465f-adcc-b555600791a5-config\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5" Apr 24 21:26:59.616338 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.616153 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-trusted-ca\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" Apr 24 21:26:59.617333 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.617241 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-trusted-ca\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.619864 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.619019 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-image-registry-private-configuration\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:26:59.622549 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.621092 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-installation-pull-secrets\") pod 
\"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.622549 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.621535 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"
Apr 24 21:26:59.625394 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.625345 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgxfm\" (UniqueName: \"kubernetes.io/projected/fd03e3ed-1908-42a1-8d28-3bdf4b8e27be-kube-api-access-fgxfm\") pod \"kube-storage-version-migrator-operator-6769c5d45-46xnr\" (UID: \"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"
Apr 24 21:26:59.625721 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.625677 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzzd5\" (UniqueName: \"kubernetes.io/projected/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-kube-api-access-wzzd5\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v"
Apr 24 21:26:59.626200 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.626157 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87pb2\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-kube-api-access-87pb2\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.626313 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.626207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-bound-sa-token\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:26:59.717511 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717414 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wld\" (UniqueName: \"kubernetes.io/projected/e1461942-917b-4737-86c2-fbe05a16beae-kube-api-access-c5wld\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"
Apr 24 21:26:59.717511 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717460 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z47w2\" (UniqueName: \"kubernetes.io/projected/1b235dfc-3b04-476b-ac23-2c6473035a29-kube-api-access-z47w2\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.717836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717617 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac4d827d-3c57-468f-bb34-d01bb87a171e-config-volume\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:26:59.717836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717645 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"
Apr 24 21:26:59.717836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508d1a61-2ffa-465f-adcc-b555600791a5-config\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"
Apr 24 21:26:59.717836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717722 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-trusted-ca\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.717836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717758 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwnmh\" (UniqueName: \"kubernetes.io/projected/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-kube-api-access-lwnmh\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt"
Apr 24 21:26:59.717836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717785 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.717836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717810 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-stats-auth\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.717836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717838 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0157928b-5de1-4b95-b80f-c4ebf799bce3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"
Apr 24 21:26:59.718219 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717904 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48f5d05c-b88d-481a-b374-755d285e0f8f-snapshots\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.718219 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717945 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/508d1a61-2ffa-465f-adcc-b555600791a5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"
Apr 24 21:26:59.718219 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.717972 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48f5d05c-b88d-481a-b374-755d285e0f8f-service-ca-bundle\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.718219 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718004 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-config\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.718219 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rdbf\" (UniqueName: \"kubernetes.io/projected/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-kube-api-access-4rdbf\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.718219 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e1461942-917b-4737-86c2-fbe05a16beae-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"
Apr 24 21:26:59.718507 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718400 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508d1a61-2ffa-465f-adcc-b555600791a5-config\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"
Apr 24 21:26:59.718507 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718428 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ac4d827d-3c57-468f-bb34-d01bb87a171e-config-volume\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:26:59.718507 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.718507 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.218484261 +0000 UTC m=+34.202699018 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : configmap references non-existent config key: service-ca.crt
Apr 24 21:26:59.718688 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718611 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt"
Apr 24 21:26:59.718688 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac4d827d-3c57-468f-bb34-d01bb87a171e-tmp-dir\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:26:59.718820 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718712 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7ghz\" (UniqueName: \"kubernetes.io/projected/ac4d827d-3c57-468f-bb34-d01bb87a171e-kube-api-access-s7ghz\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:26:59.718820 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718756 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knnr7\" (UniqueName: \"kubernetes.io/projected/17a3c1f7-f576-4683-9638-e6eab0e8ff34-kube-api-access-knnr7\") pod \"network-check-source-8894fc9bd-lks6f\" (UID: \"17a3c1f7-f576-4683-9638-e6eab0e8ff34\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f"
Apr 24 21:26:59.718820 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718791 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.718820 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718792 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-trusted-ca\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.718820 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718817 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48f5d05c-b88d-481a-b374-755d285e0f8f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.719048 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718838 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/e1461942-917b-4737-86c2-fbe05a16beae-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"
Apr 24 21:26:59.719048 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.718921 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:26:59.719048 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718936 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-config\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.719048 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.718951 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:26:59.719048 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.718952 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64k7k\" (UniqueName: \"kubernetes.io/projected/48f5d05c-b88d-481a-b374-755d285e0f8f-kube-api-access-64k7k\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.719048 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.718983 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.218969681 +0000 UTC m=+34.203184437 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : secret "router-metrics-certs-default" not found
Apr 24 21:26:59.719048 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.719019 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls podName:e1461942-917b-4737-86c2-fbe05a16beae nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.218996105 +0000 UTC m=+34.203210862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pb7ns" (UID: "e1461942-917b-4737-86c2-fbe05a16beae") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:26:59.719048 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719038 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac4d827d-3c57-468f-bb34-d01bb87a171e-tmp-dir\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719067 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-default-certificate\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719106 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdps8\" (UniqueName: \"kubernetes.io/projected/508d1a61-2ffa-465f-adcc-b555600791a5-kube-api-access-xdps8\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719161 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-serving-cert\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48f5d05c-b88d-481a-b374-755d285e0f8f-serving-cert\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.719222 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719227 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5bdq\" (UniqueName: \"kubernetes.io/projected/bc8aa199-ad2f-4557-9430-ad968419174b-kube-api-access-m5bdq\") pod \"volume-data-source-validator-7c6cbb6c87-zqdt8\" (UID: \"bc8aa199-ad2f-4557-9430-ad968419174b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8"
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719256 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.719288 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls podName:ac4d827d-3c57-468f-bb34-d01bb87a171e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.219271832 +0000 UTC m=+34.203486595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls") pod "dns-default-lfbbd" (UID: "ac4d827d-3c57-468f-bb34-d01bb87a171e") : secret "dns-default-metrics-tls" not found
Apr 24 21:26:59.719443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.719337 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48f5d05c-b88d-481a-b374-755d285e0f8f-tmp\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.721565 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.721534 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/508d1a61-2ffa-465f-adcc-b555600791a5-serving-cert\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"
Apr 24 21:26:59.721803 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.721775 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-stats-auth\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.721933 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.721909 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-default-certificate\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.723040 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.723020 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-serving-cert\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.728412 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.728345 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z47w2\" (UniqueName: \"kubernetes.io/projected/1b235dfc-3b04-476b-ac23-2c6473035a29-kube-api-access-z47w2\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz"
Apr 24 21:26:59.728821 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.728422 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wld\" (UniqueName: \"kubernetes.io/projected/e1461942-917b-4737-86c2-fbe05a16beae-kube-api-access-c5wld\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"
Apr 24 21:26:59.728821 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.728426 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rdbf\" (UniqueName: \"kubernetes.io/projected/6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd-kube-api-access-4rdbf\") pod \"console-operator-9d4b6777b-xngnk\" (UID: \"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd\") " pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.731101 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.731046 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5bdq\" (UniqueName: \"kubernetes.io/projected/bc8aa199-ad2f-4557-9430-ad968419174b-kube-api-access-m5bdq\") pod \"volume-data-source-validator-7c6cbb6c87-zqdt8\" (UID: \"bc8aa199-ad2f-4557-9430-ad968419174b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8"
Apr 24 21:26:59.731416 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.731395 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7ghz\" (UniqueName: \"kubernetes.io/projected/ac4d827d-3c57-468f-bb34-d01bb87a171e-kube-api-access-s7ghz\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:26:59.731476 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.731443 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdps8\" (UniqueName: \"kubernetes.io/projected/508d1a61-2ffa-465f-adcc-b555600791a5-kube-api-access-xdps8\") pod \"service-ca-operator-d6fc45fc5-cfgz5\" (UID: \"508d1a61-2ffa-465f-adcc-b555600791a5\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"
Apr 24 21:26:59.749090 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.749051 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"
Apr 24 21:26:59.792416 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.792380 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"
Apr 24 21:26:59.819902 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.819861 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwnmh\" (UniqueName: \"kubernetes.io/projected/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-kube-api-access-lwnmh\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt"
Apr 24 21:26:59.820119 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.819926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0157928b-5de1-4b95-b80f-c4ebf799bce3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"
Apr 24 21:26:59.820185 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820108 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48f5d05c-b88d-481a-b374-755d285e0f8f-snapshots\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.820185 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820171 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48f5d05c-b88d-481a-b374-755d285e0f8f-service-ca-bundle\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.820276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820216 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt"
Apr 24 21:26:59.820276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820258 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knnr7\" (UniqueName: \"kubernetes.io/projected/17a3c1f7-f576-4683-9638-e6eab0e8ff34-kube-api-access-knnr7\") pod \"network-check-source-8894fc9bd-lks6f\" (UID: \"17a3c1f7-f576-4683-9638-e6eab0e8ff34\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f"
Apr 24 21:26:59.820375 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820304 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48f5d05c-b88d-481a-b374-755d285e0f8f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.820375 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820329 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64k7k\" (UniqueName: \"kubernetes.io/projected/48f5d05c-b88d-481a-b374-755d285e0f8f-kube-api-access-64k7k\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.820461 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48f5d05c-b88d-481a-b374-755d285e0f8f-serving-cert\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.820461 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"
Apr 24 21:26:59.820461 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820425 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48f5d05c-b88d-481a-b374-755d285e0f8f-tmp\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.820608 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.820591 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:26:59.820692 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.820686 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert podName:01ac23ff-5ced-4c6d-b63f-bd951a6746ec nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.320644314 +0000 UTC m=+34.304859088 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert") pod "ingress-canary-7mllt" (UID: "01ac23ff-5ced-4c6d-b63f-bd951a6746ec") : secret "canary-serving-cert" not found
Apr 24 21:26:59.820756 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.820704 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:26:59.820756 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820713 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/0157928b-5de1-4b95-b80f-c4ebf799bce3-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"
Apr 24 21:26:59.820756 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:26:59.820746 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert podName:0157928b-5de1-4b95-b80f-c4ebf799bce3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:00.320735305 +0000 UTC m=+34.304950062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8bbnh" (UID: "0157928b-5de1-4b95-b80f-c4ebf799bce3") : secret "networking-console-plugin-cert" not found
Apr 24 21:26:59.821006 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.820973 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/48f5d05c-b88d-481a-b374-755d285e0f8f-snapshots\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.821142 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.821124 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48f5d05c-b88d-481a-b374-755d285e0f8f-tmp\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.821247 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.821229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48f5d05c-b88d-481a-b374-755d285e0f8f-service-ca-bundle\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.821797 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.821774 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48f5d05c-b88d-481a-b374-755d285e0f8f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.823619 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.823589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48f5d05c-b88d-481a-b374-755d285e0f8f-serving-cert\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.829910 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.829875 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwnmh\" (UniqueName: \"kubernetes.io/projected/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-kube-api-access-lwnmh\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt"
Apr 24 21:26:59.831048 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.831017 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knnr7\" (UniqueName: \"kubernetes.io/projected/17a3c1f7-f576-4683-9638-e6eab0e8ff34-kube-api-access-knnr7\") pod \"network-check-source-8894fc9bd-lks6f\" (UID: \"17a3c1f7-f576-4683-9638-e6eab0e8ff34\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f"
Apr 24 21:26:59.831604 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.831581 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64k7k\" (UniqueName: \"kubernetes.io/projected/48f5d05c-b88d-481a-b374-755d285e0f8f-kube-api-access-64k7k\") pod \"insights-operator-585dfdc468-6rdb2\" (UID: \"48f5d05c-b88d-481a-b374-755d285e0f8f\") " pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.835416 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.835393 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8"
Apr 24 21:26:59.854541 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.854479 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:26:59.882818 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.882781 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-6rdb2"
Apr 24 21:26:59.945579 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:26:59.945542 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f"
Apr 24 21:27:00.123911 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.123788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:27:00.124089 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.123908 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:00.124089 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.123929 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c59b84694-bjv4k: secret "image-registry-tls" not found
Apr 24 21:27:00.124089 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.123936 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v"
Apr 24 21:27:00.124089 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.123998 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls podName:0e205ba8-7366-427a-9381-12562fbe3d12 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.123974876 +0000 UTC m=+35.108189656 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls") pod "image-registry-7c59b84694-bjv4k" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12") : secret "image-registry-tls" not found
Apr 24 21:27:00.124089 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.124049 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:27:00.124339 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.124133 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls podName:f21b4cf7-af90-4e90-a786-e7f271ec6fcc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.124116325 +0000 UTC m=+35.108331081 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7597v" (UID: "f21b4cf7-af90-4e90-a786-e7f271ec6fcc") : secret "samples-operator-tls" not found Apr 24 21:27:00.224725 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.224681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:00.224918 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.224753 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:27:00.224918 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.224781 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:00.224918 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.224851 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:27:00.224918 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.224890 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:00.224918 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.224903 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:00.225161 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.224940 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:27:00.225161 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.224979 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls podName:ac4d827d-3c57-468f-bb34-d01bb87a171e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.224958606 +0000 UTC m=+35.209173369 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls") pod "dns-default-lfbbd" (UID: "ac4d827d-3c57-468f-bb34-d01bb87a171e") : secret "dns-default-metrics-tls" not found Apr 24 21:27:00.225161 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.224984 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:00.225161 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.225023 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 21:27:00.225161 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.225041 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls podName:e1461942-917b-4737-86c2-fbe05a16beae nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.225021729 +0000 UTC m=+35.209236493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pb7ns" (UID: "e1461942-917b-4737-86c2-fbe05a16beae") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:00.225161 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.225065 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs podName:88be4377-88c5-417f-8cba-f0a7f6d5f16e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.225050546 +0000 UTC m=+66.209265322 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs") pod "network-metrics-daemon-hf9r5" (UID: "88be4377-88c5-417f-8cba-f0a7f6d5f16e") : secret "metrics-daemon-secret" not found Apr 24 21:27:00.225161 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.225095 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.225083663 +0000 UTC m=+35.209298422 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:00.225161 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.225133 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.225109147 +0000 UTC m=+35.209323907 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : secret "router-metrics-certs-default" not found Apr 24 21:27:00.326652 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.326607 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" Apr 24 21:27:00.326856 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.326699 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:27:00.326856 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.326753 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:00.326856 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.326837 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert podName:0157928b-5de1-4b95-b80f-c4ebf799bce3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.326816928 +0000 UTC m=+35.311031688 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8bbnh" (UID: "0157928b-5de1-4b95-b80f-c4ebf799bce3") : secret "networking-console-plugin-cert" not found Apr 24 21:27:00.327020 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.326879 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:27:00.327075 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.327020 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:00.327127 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:00.327077 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert podName:01ac23ff-5ced-4c6d-b63f-bd951a6746ec nodeName:}" failed. No retries permitted until 2026-04-24 21:27:01.32706033 +0000 UTC m=+35.311275098 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert") pod "ingress-canary-7mllt" (UID: "01ac23ff-5ced-4c6d-b63f-bd951a6746ec") : secret "canary-serving-cert" not found Apr 24 21:27:00.330240 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.330218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc7wh\" (UniqueName: \"kubernetes.io/projected/7e98eeca-23eb-4e4c-b591-118f914a93a1-kube-api-access-fc7wh\") pod \"network-check-target-hzw5v\" (UID: \"7e98eeca-23eb-4e4c-b591-118f914a93a1\") " pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:27:00.520151 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.516339 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:27:00.640306 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.640266 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5"] Apr 24 21:27:00.663786 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.663762 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr"] Apr 24 21:27:00.666321 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.666294 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f"] Apr 24 21:27:00.669554 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.669514 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8"] Apr 24 21:27:00.670831 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.670807 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-6rdb2"] Apr 24 
21:27:00.674157 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.674131 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xngnk"] Apr 24 21:27:00.710223 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:00.710182 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod508d1a61_2ffa_465f_adcc_b555600791a5.slice/crio-6cd72b548197d49b5a85110bacbbb68571657ce4bc111d5a4024f9e13b72c92c WatchSource:0}: Error finding container 6cd72b548197d49b5a85110bacbbb68571657ce4bc111d5a4024f9e13b72c92c: Status 404 returned error can't find the container with id 6cd72b548197d49b5a85110bacbbb68571657ce4bc111d5a4024f9e13b72c92c Apr 24 21:27:00.712314 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.712286 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hzw5v"] Apr 24 21:27:00.716157 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:00.716126 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd03e3ed_1908_42a1_8d28_3bdf4b8e27be.slice/crio-8d8b89e55c96789f90a8f09c07555376cf9bb59f8a22b6c45878fa3ab15bb8f6 WatchSource:0}: Error finding container 8d8b89e55c96789f90a8f09c07555376cf9bb59f8a22b6c45878fa3ab15bb8f6: Status 404 returned error can't find the container with id 8d8b89e55c96789f90a8f09c07555376cf9bb59f8a22b6c45878fa3ab15bb8f6 Apr 24 21:27:00.716411 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:00.716377 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17a3c1f7_f576_4683_9638_e6eab0e8ff34.slice/crio-06ae9af6ff379c5ea11405223a2cc95e0ecd66f068672bdd58aa86a07898d31b WatchSource:0}: Error finding container 06ae9af6ff379c5ea11405223a2cc95e0ecd66f068672bdd58aa86a07898d31b: Status 404 returned error can't find the container 
with id 06ae9af6ff379c5ea11405223a2cc95e0ecd66f068672bdd58aa86a07898d31b Apr 24 21:27:00.717819 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:00.717792 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f5d05c_b88d_481a_b374_755d285e0f8f.slice/crio-c1a925f4245900eae5d15a2ec05d04f896e46330909ef63dde2e23988eb4236a WatchSource:0}: Error finding container c1a925f4245900eae5d15a2ec05d04f896e46330909ef63dde2e23988eb4236a: Status 404 returned error can't find the container with id c1a925f4245900eae5d15a2ec05d04f896e46330909ef63dde2e23988eb4236a Apr 24 21:27:00.719110 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:00.719086 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc8aa199_ad2f_4557_9430_ad968419174b.slice/crio-c17f924b8e971319054745e11cb38e70a1788f5eea5d8c50a75e4e3612598c70 WatchSource:0}: Error finding container c17f924b8e971319054745e11cb38e70a1788f5eea5d8c50a75e4e3612598c70: Status 404 returned error can't find the container with id c17f924b8e971319054745e11cb38e70a1788f5eea5d8c50a75e4e3612598c70 Apr 24 21:27:00.720304 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:00.720178 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ad3b0df_323a_4ac2_bc1a_d5da7af6e8fd.slice/crio-026b4e0617226c78abd3330552c63dc9da0d76a338381503f1d34e5f329d38fa WatchSource:0}: Error finding container 026b4e0617226c78abd3330552c63dc9da0d76a338381503f1d34e5f329d38fa: Status 404 returned error can't find the container with id 026b4e0617226c78abd3330552c63dc9da0d76a338381503f1d34e5f329d38fa Apr 24 21:27:00.721963 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:00.721923 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e98eeca_23eb_4e4c_b591_118f914a93a1.slice/crio-cf5196b6a3d57e5929734d827f5bc4ac9ed178077fa8ae18aa2bc1cdd6da077f WatchSource:0}: Error finding container cf5196b6a3d57e5929734d827f5bc4ac9ed178077fa8ae18aa2bc1cdd6da077f: Status 404 returned error can't find the container with id cf5196b6a3d57e5929734d827f5bc4ac9ed178077fa8ae18aa2bc1cdd6da077f Apr 24 21:27:00.724242 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.724215 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6rdb2" event={"ID":"48f5d05c-b88d-481a-b374-755d285e0f8f","Type":"ContainerStarted","Data":"c1a925f4245900eae5d15a2ec05d04f896e46330909ef63dde2e23988eb4236a"} Apr 24 21:27:00.725355 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.725332 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f" event={"ID":"17a3c1f7-f576-4683-9638-e6eab0e8ff34","Type":"ContainerStarted","Data":"06ae9af6ff379c5ea11405223a2cc95e0ecd66f068672bdd58aa86a07898d31b"} Apr 24 21:27:00.726442 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.726421 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" event={"ID":"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd","Type":"ContainerStarted","Data":"026b4e0617226c78abd3330552c63dc9da0d76a338381503f1d34e5f329d38fa"} Apr 24 21:27:00.727509 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.727488 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8" event={"ID":"bc8aa199-ad2f-4557-9430-ad968419174b","Type":"ContainerStarted","Data":"c17f924b8e971319054745e11cb38e70a1788f5eea5d8c50a75e4e3612598c70"} Apr 24 21:27:00.728777 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.728753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5" event={"ID":"508d1a61-2ffa-465f-adcc-b555600791a5","Type":"ContainerStarted","Data":"6cd72b548197d49b5a85110bacbbb68571657ce4bc111d5a4024f9e13b72c92c"} Apr 24 21:27:00.729783 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:00.729760 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" event={"ID":"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be","Type":"ContainerStarted","Data":"8d8b89e55c96789f90a8f09c07555376cf9bb59f8a22b6c45878fa3ab15bb8f6"} Apr 24 21:27:01.136206 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.136164 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:27:01.136423 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.136262 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:27:01.136423 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.136320 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:01.136423 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.136417 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls 
podName:f21b4cf7-af90-4e90-a786-e7f271ec6fcc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.136394582 +0000 UTC m=+37.120609355 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7597v" (UID: "f21b4cf7-af90-4e90-a786-e7f271ec6fcc") : secret "samples-operator-tls" not found Apr 24 21:27:01.136601 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.136443 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:01.136601 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.136465 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c59b84694-bjv4k: secret "image-registry-tls" not found Apr 24 21:27:01.136601 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.136526 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls podName:0e205ba8-7366-427a-9381-12562fbe3d12 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.13650989 +0000 UTC m=+37.120724659 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls") pod "image-registry-7c59b84694-bjv4k" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12") : secret "image-registry-tls" not found Apr 24 21:27:01.237907 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.237813 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:01.237907 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.237883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:27:01.238134 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.237941 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:27:01.238134 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.237965 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 
21:27:01.238225 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.238148 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.238134701 +0000 UTC m=+37.222349458 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:01.238537 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.238517 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:01.238618 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.238573 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.238557059 +0000 UTC m=+37.222771820 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : secret "router-metrics-certs-default" not found Apr 24 21:27:01.238715 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.238625 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:01.238715 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.238675 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls podName:ac4d827d-3c57-468f-bb34-d01bb87a171e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.238644172 +0000 UTC m=+37.222858938 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls") pod "dns-default-lfbbd" (UID: "ac4d827d-3c57-468f-bb34-d01bb87a171e") : secret "dns-default-metrics-tls" not found Apr 24 21:27:01.238815 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.238727 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:01.238815 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.238757 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls podName:e1461942-917b-4737-86c2-fbe05a16beae nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.238747396 +0000 UTC m=+37.222962159 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pb7ns" (UID: "e1461942-917b-4737-86c2-fbe05a16beae") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:01.339612 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.339573 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:27:01.339810 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.339699 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" Apr 24 21:27:01.339859 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.339835 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:01.339900 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.339890 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert podName:01ac23ff-5ced-4c6d-b63f-bd951a6746ec nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.339870999 +0000 UTC m=+37.324085759 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert") pod "ingress-canary-7mllt" (UID: "01ac23ff-5ced-4c6d-b63f-bd951a6746ec") : secret "canary-serving-cert" not found Apr 24 21:27:01.340383 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.340272 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:01.340383 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:01.340348 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert podName:0157928b-5de1-4b95-b80f-c4ebf799bce3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:03.340329433 +0000 UTC m=+37.324544204 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8bbnh" (UID: "0157928b-5de1-4b95-b80f-c4ebf799bce3") : secret "networking-console-plugin-cert" not found Apr 24 21:27:01.440442 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.440379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") " pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:27:01.450613 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.450361 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4fcf0e65-bdc1-401b-98e2-00ff3294162f-original-pull-secret\") pod \"global-pull-secret-syncer-vqqh6\" (UID: \"4fcf0e65-bdc1-401b-98e2-00ff3294162f\") 
" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:27:01.734737 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.734612 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hzw5v" event={"ID":"7e98eeca-23eb-4e4c-b591-118f914a93a1","Type":"ContainerStarted","Data":"cf5196b6a3d57e5929734d827f5bc4ac9ed178077fa8ae18aa2bc1cdd6da077f"} Apr 24 21:27:01.740273 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.739837 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vqqh6" Apr 24 21:27:01.747093 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.746071 2580 generic.go:358] "Generic (PLEG): container finished" podID="995ed227-fb30-4b70-9c48-e4516dc0a85c" containerID="c3c976c236a5f9f16f3d045db8635cb6f86d6cfeefa8f3fa44c77816d1afe468" exitCode=0 Apr 24 21:27:01.747093 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:01.746122 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerDied","Data":"c3c976c236a5f9f16f3d045db8635cb6f86d6cfeefa8f3fa44c77816d1afe468"} Apr 24 21:27:02.030928 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:02.030870 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vqqh6"] Apr 24 21:27:02.038839 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:02.038626 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fcf0e65_bdc1_401b_98e2_00ff3294162f.slice/crio-7a2668a9b8406949218532a6a588b77da9517f1e1f095b0ec24db09587d0e648 WatchSource:0}: Error finding container 7a2668a9b8406949218532a6a588b77da9517f1e1f095b0ec24db09587d0e648: Status 404 returned error can't find the container with id 7a2668a9b8406949218532a6a588b77da9517f1e1f095b0ec24db09587d0e648 Apr 24 
21:27:02.760696 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:02.760598 2580 generic.go:358] "Generic (PLEG): container finished" podID="995ed227-fb30-4b70-9c48-e4516dc0a85c" containerID="35e341a3bb42032ddc17e236281d39dc20e3f5f2e701471d19fc402650a6509f" exitCode=0 Apr 24 21:27:02.760696 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:02.760688 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerDied","Data":"35e341a3bb42032ddc17e236281d39dc20e3f5f2e701471d19fc402650a6509f"} Apr 24 21:27:02.765230 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:02.765191 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vqqh6" event={"ID":"4fcf0e65-bdc1-401b-98e2-00ff3294162f","Type":"ContainerStarted","Data":"7a2668a9b8406949218532a6a588b77da9517f1e1f095b0ec24db09587d0e648"} Apr 24 21:27:03.159041 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:03.159004 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:27:03.159239 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:03.159155 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:27:03.159319 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.159304 2580 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:03.159366 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.159359 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls podName:f21b4cf7-af90-4e90-a786-e7f271ec6fcc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.159344575 +0000 UTC m=+41.143559332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7597v" (UID: "f21b4cf7-af90-4e90-a786-e7f271ec6fcc") : secret "samples-operator-tls" not found Apr 24 21:27:03.159802 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.159778 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:03.159802 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.159802 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c59b84694-bjv4k: secret "image-registry-tls" not found Apr 24 21:27:03.159965 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.159862 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls podName:0e205ba8-7366-427a-9381-12562fbe3d12 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.159842013 +0000 UTC m=+41.144056786 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls") pod "image-registry-7c59b84694-bjv4k" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12") : secret "image-registry-tls" not found Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:03.260156 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:03.260222 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:03.260366 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:03.260404 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 
21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.260545 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.260612 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls podName:ac4d827d-3c57-468f-bb34-d01bb87a171e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.260591356 +0000 UTC m=+41.244806143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls") pod "dns-default-lfbbd" (UID: "ac4d827d-3c57-468f-bb34-d01bb87a171e") : secret "dns-default-metrics-tls" not found Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.261045 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.261101 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls podName:e1461942-917b-4737-86c2-fbe05a16beae nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.261084155 +0000 UTC m=+41.245298924 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pb7ns" (UID: "e1461942-917b-4737-86c2-fbe05a16beae") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.261171 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.261159117 +0000 UTC m=+41.245373874 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.261231 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:03.261344 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.261263 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.261252585 +0000 UTC m=+41.245467344 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : secret "router-metrics-certs-default" not found Apr 24 21:27:03.362575 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:03.361679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" Apr 24 21:27:03.362575 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:03.361860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:27:03.362575 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.362022 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:03.362575 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.362089 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert podName:01ac23ff-5ced-4c6d-b63f-bd951a6746ec nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.362069225 +0000 UTC m=+41.346283985 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert") pod "ingress-canary-7mllt" (UID: "01ac23ff-5ced-4c6d-b63f-bd951a6746ec") : secret "canary-serving-cert" not found Apr 24 21:27:03.362575 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.362481 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:03.362575 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:03.362537 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert podName:0157928b-5de1-4b95-b80f-c4ebf799bce3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.362520905 +0000 UTC m=+41.346735662 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8bbnh" (UID: "0157928b-5de1-4b95-b80f-c4ebf799bce3") : secret "networking-console-plugin-cert" not found Apr 24 21:27:07.201433 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:07.201393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:27:07.201904 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:07.201493 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: 
\"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:27:07.201904 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.201564 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:07.201904 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.201625 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:07.201904 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.201637 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls podName:f21b4cf7-af90-4e90-a786-e7f271ec6fcc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.201622138 +0000 UTC m=+49.185836900 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7597v" (UID: "f21b4cf7-af90-4e90-a786-e7f271ec6fcc") : secret "samples-operator-tls" not found Apr 24 21:27:07.201904 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.201643 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c59b84694-bjv4k: secret "image-registry-tls" not found Apr 24 21:27:07.201904 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.201713 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls podName:0e205ba8-7366-427a-9381-12562fbe3d12 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.201694555 +0000 UTC m=+49.185909315 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls") pod "image-registry-7c59b84694-bjv4k" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12") : secret "image-registry-tls" not found Apr 24 21:27:07.302415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:07.302368 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:07.302643 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:07.302438 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:27:07.302643 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:07.302483 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:27:07.302643 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:07.302520 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 
21:27:07.302643 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.302517 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:07.302643 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.302611 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:07.302643 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.302612 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:07.302978 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.302619 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.302601558 +0000 UTC m=+49.286816320 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:07.302978 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.302721 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls podName:ac4d827d-3c57-468f-bb34-d01bb87a171e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.302685243 +0000 UTC m=+49.286900000 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls") pod "dns-default-lfbbd" (UID: "ac4d827d-3c57-468f-bb34-d01bb87a171e") : secret "dns-default-metrics-tls" not found Apr 24 21:27:07.302978 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.302734 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.30272755 +0000 UTC m=+49.286942306 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : secret "router-metrics-certs-default" not found Apr 24 21:27:07.302978 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.302751 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls podName:e1461942-917b-4737-86c2-fbe05a16beae nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.302741748 +0000 UTC m=+49.286956505 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pb7ns" (UID: "e1461942-917b-4737-86c2-fbe05a16beae") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:07.403993 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:07.403950 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:27:07.404213 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:07.404067 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" Apr 24 21:27:07.404213 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.404123 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:07.404213 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.404182 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:07.404213 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.404196 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert podName:01ac23ff-5ced-4c6d-b63f-bd951a6746ec nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:15.40417677 +0000 UTC m=+49.388391530 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert") pod "ingress-canary-7mllt" (UID: "01ac23ff-5ced-4c6d-b63f-bd951a6746ec") : secret "canary-serving-cert" not found Apr 24 21:27:07.404423 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:07.404229 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert podName:0157928b-5de1-4b95-b80f-c4ebf799bce3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:15.404217248 +0000 UTC m=+49.388432005 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8bbnh" (UID: "0157928b-5de1-4b95-b80f-c4ebf799bce3") : secret "networking-console-plugin-cert" not found Apr 24 21:27:10.791218 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.791178 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" event={"ID":"995ed227-fb30-4b70-9c48-e4516dc0a85c","Type":"ContainerStarted","Data":"bd4b3ebca6e2c92b0a1cce41a0c5fce6be9ad25763f198bb23bb6d59b05aa669"} Apr 24 21:27:10.793159 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.793134 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vqqh6" event={"ID":"4fcf0e65-bdc1-401b-98e2-00ff3294162f","Type":"ContainerStarted","Data":"7618ea1f4a08ebe63c6dc5e46aabe45945cf782fef9de78ccefd6aabf559355f"} Apr 24 21:27:10.794988 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.794967 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6rdb2" 
event={"ID":"48f5d05c-b88d-481a-b374-755d285e0f8f","Type":"ContainerStarted","Data":"16809ff2186ed7e1e34fb45ca68962bd330fc74f13b2b9d0f91b9849ea672445"} Apr 24 21:27:10.796468 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.796445 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f" event={"ID":"17a3c1f7-f576-4683-9638-e6eab0e8ff34","Type":"ContainerStarted","Data":"332876f8191b6d3273c6f9a67aac6e136d1b9a6267ace6a774a5652683fdff85"} Apr 24 21:27:10.798083 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.798066 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/0.log" Apr 24 21:27:10.798154 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.798101 2580 generic.go:358] "Generic (PLEG): container finished" podID="6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd" containerID="9f35cc0636c739bf7e3ddf194fa829d370e13be99673002a82141a2167603892" exitCode=255 Apr 24 21:27:10.798428 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.798247 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" event={"ID":"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd","Type":"ContainerDied","Data":"9f35cc0636c739bf7e3ddf194fa829d370e13be99673002a82141a2167603892"} Apr 24 21:27:10.798428 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.798407 2580 scope.go:117] "RemoveContainer" containerID="9f35cc0636c739bf7e3ddf194fa829d370e13be99673002a82141a2167603892" Apr 24 21:27:10.800439 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.800418 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hzw5v" event={"ID":"7e98eeca-23eb-4e4c-b591-118f914a93a1","Type":"ContainerStarted","Data":"bfbefb2391cdfccc5f397a959ffc8343e4474dafdaf8fac050596a6705f24461"} Apr 24 21:27:10.800893 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:27:10.800817 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hzw5v" Apr 24 21:27:10.802178 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.802156 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8" event={"ID":"bc8aa199-ad2f-4557-9430-ad968419174b","Type":"ContainerStarted","Data":"c6c31b06a18e91c88d4a64835330218d1bddc2546ea1b09e098a27b9d02727d3"} Apr 24 21:27:10.804038 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.804018 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5" event={"ID":"508d1a61-2ffa-465f-adcc-b555600791a5","Type":"ContainerStarted","Data":"e181dbec454ddcf8b94cbfe63e51f437c05e066c71e2a1ab391f413c8b66bd0d"} Apr 24 21:27:10.805712 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.805694 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" event={"ID":"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be","Type":"ContainerStarted","Data":"dc359d6b9c8db265d5e5c5e59e3704ac5b2fc3118b8136b17b1911d814db7527"} Apr 24 21:27:10.818991 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.818932 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tqs9r" podStartSLOduration=13.275320942 podStartE2EDuration="44.818913647s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:26:29.221370256 +0000 UTC m=+3.205585022" lastFinishedPulling="2026-04-24 21:27:00.764962955 +0000 UTC m=+34.749177727" observedRunningTime="2026-04-24 21:27:10.816143638 +0000 UTC m=+44.800358400" watchObservedRunningTime="2026-04-24 21:27:10.818913647 +0000 UTC m=+44.803128427" Apr 24 21:27:10.833773 
ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.833731 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" podStartSLOduration=32.492624848 podStartE2EDuration="41.83371688s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:00.717734902 +0000 UTC m=+34.701949673" lastFinishedPulling="2026-04-24 21:27:10.058826947 +0000 UTC m=+44.043041705" observedRunningTime="2026-04-24 21:27:10.833478694 +0000 UTC m=+44.817693476" watchObservedRunningTime="2026-04-24 21:27:10.83371688 +0000 UTC m=+44.817931659" Apr 24 21:27:10.861710 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.861598 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zqdt8" podStartSLOduration=31.641173421 podStartE2EDuration="40.861579685s" podCreationTimestamp="2026-04-24 21:26:30 +0000 UTC" firstStartedPulling="2026-04-24 21:27:00.740653637 +0000 UTC m=+34.724868400" lastFinishedPulling="2026-04-24 21:27:09.961059901 +0000 UTC m=+43.945274664" observedRunningTime="2026-04-24 21:27:10.860745336 +0000 UTC m=+44.844960114" watchObservedRunningTime="2026-04-24 21:27:10.861579685 +0000 UTC m=+44.845794465" Apr 24 21:27:10.885395 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.885326 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hzw5v" podStartSLOduration=35.551568958 podStartE2EDuration="44.885296921s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:27:00.740648193 +0000 UTC m=+34.724862956" lastFinishedPulling="2026-04-24 21:27:10.074376152 +0000 UTC m=+44.058590919" observedRunningTime="2026-04-24 21:27:10.883770126 +0000 UTC m=+44.867984910" watchObservedRunningTime="2026-04-24 21:27:10.885296921 +0000 UTC 
m=+44.869511701" Apr 24 21:27:10.923580 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:10.923521 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5" podStartSLOduration=32.577388648 podStartE2EDuration="41.923501715s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:00.712587989 +0000 UTC m=+34.696802753" lastFinishedPulling="2026-04-24 21:27:10.058701061 +0000 UTC m=+44.042915820" observedRunningTime="2026-04-24 21:27:10.922844846 +0000 UTC m=+44.907059628" watchObservedRunningTime="2026-04-24 21:27:10.923501715 +0000 UTC m=+44.907716495" Apr 24 21:27:11.026301 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.026239 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-6rdb2" podStartSLOduration=31.70696899 podStartE2EDuration="41.026218044s" podCreationTimestamp="2026-04-24 21:26:30 +0000 UTC" firstStartedPulling="2026-04-24 21:27:00.740671921 +0000 UTC m=+34.724886694" lastFinishedPulling="2026-04-24 21:27:10.059920987 +0000 UTC m=+44.044135748" observedRunningTime="2026-04-24 21:27:10.998154909 +0000 UTC m=+44.982369687" watchObservedRunningTime="2026-04-24 21:27:11.026218044 +0000 UTC m=+45.010432826" Apr 24 21:27:11.026473 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.026405 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-lks6f" podStartSLOduration=31.686929418 podStartE2EDuration="41.026396121s" podCreationTimestamp="2026-04-24 21:26:30 +0000 UTC" firstStartedPulling="2026-04-24 21:27:00.719231028 +0000 UTC m=+34.703445786" lastFinishedPulling="2026-04-24 21:27:10.058697709 +0000 UTC m=+44.042912489" observedRunningTime="2026-04-24 21:27:11.026334734 +0000 UTC m=+45.010549503" watchObservedRunningTime="2026-04-24 21:27:11.026396121 +0000 UTC 
m=+45.010610903" Apr 24 21:27:11.053892 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.053782 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vqqh6" podStartSLOduration=34.043309243 podStartE2EDuration="42.05375982s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:02.047710732 +0000 UTC m=+36.031925495" lastFinishedPulling="2026-04-24 21:27:10.0581613 +0000 UTC m=+44.042376072" observedRunningTime="2026-04-24 21:27:11.053262218 +0000 UTC m=+45.037477012" watchObservedRunningTime="2026-04-24 21:27:11.05375982 +0000 UTC m=+45.037974604" Apr 24 21:27:11.810960 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.810929 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:27:11.811372 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.811336 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/0.log" Apr 24 21:27:11.811429 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.811370 2580 generic.go:358] "Generic (PLEG): container finished" podID="6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd" containerID="d94795a96dbaa1701b7f8584f1a5734872ff15367c972198a09cfa6d337dccc1" exitCode=255 Apr 24 21:27:11.811707 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.811516 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" event={"ID":"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd","Type":"ContainerDied","Data":"d94795a96dbaa1701b7f8584f1a5734872ff15367c972198a09cfa6d337dccc1"} Apr 24 21:27:11.811707 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.811562 2580 scope.go:117] "RemoveContainer" 
containerID="9f35cc0636c739bf7e3ddf194fa829d370e13be99673002a82141a2167603892" Apr 24 21:27:11.811891 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:11.811786 2580 scope.go:117] "RemoveContainer" containerID="d94795a96dbaa1701b7f8584f1a5734872ff15367c972198a09cfa6d337dccc1" Apr 24 21:27:11.811994 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:11.811976 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xngnk_openshift-console-operator(6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" podUID="6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd" Apr 24 21:27:12.816942 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:12.816913 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:27:12.817421 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:12.817244 2580 scope.go:117] "RemoveContainer" containerID="d94795a96dbaa1701b7f8584f1a5734872ff15367c972198a09cfa6d337dccc1" Apr 24 21:27:12.817493 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:12.817439 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xngnk_openshift-console-operator(6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" podUID="6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd" Apr 24 21:27:13.250455 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:13.250428 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kp4z8_8d58b512-cb70-43b1-ac41-1be3111f0ccc/dns-node-resolver/0.log" Apr 24 
21:27:14.451270 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:14.451240 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tcjn9_6fcc9f57-5d75-40c0-88a6-9f4985a693ad/node-ca/0.log" Apr 24 21:27:15.282373 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:15.282335 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:27:15.282553 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:15.282434 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:27:15.282553 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.282488 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:15.282553 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.282550 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls podName:f21b4cf7-af90-4e90-a786-e7f271ec6fcc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.282535615 +0000 UTC m=+65.266750372 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7597v" (UID: "f21b4cf7-af90-4e90-a786-e7f271ec6fcc") : secret "samples-operator-tls" not found Apr 24 21:27:15.282676 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.282573 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:15.282676 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.282590 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7c59b84694-bjv4k: secret "image-registry-tls" not found Apr 24 21:27:15.282676 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.282634 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls podName:0e205ba8-7366-427a-9381-12562fbe3d12 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.28262138 +0000 UTC m=+65.266836140 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls") pod "image-registry-7c59b84694-bjv4k" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12") : secret "image-registry-tls" not found Apr 24 21:27:15.383681 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:15.383617 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:27:15.383681 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:15.383681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:15.383888 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.383756 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:15.383888 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:15.383778 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:15.383888 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:15.383805 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:27:15.383888 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.383826 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls podName:e1461942-917b-4737-86c2-fbe05a16beae nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.383805349 +0000 UTC m=+65.368020107 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pb7ns" (UID: "e1461942-917b-4737-86c2-fbe05a16beae") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:15.383888 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.383866 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:15.383888 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.383874 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:15.384063 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.383870 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.383851066 +0000 UTC m=+65.368065832 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:15.384063 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.383932 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs podName:1b235dfc-3b04-476b-ac23-2c6473035a29 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.38392375 +0000 UTC m=+65.368138508 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs") pod "router-default-66f8f496b4-4gdhz" (UID: "1b235dfc-3b04-476b-ac23-2c6473035a29") : secret "router-metrics-certs-default" not found Apr 24 21:27:15.384063 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.383943 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls podName:ac4d827d-3c57-468f-bb34-d01bb87a171e nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.383937001 +0000 UTC m=+65.368151769 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls") pod "dns-default-lfbbd" (UID: "ac4d827d-3c57-468f-bb34-d01bb87a171e") : secret "dns-default-metrics-tls" not found Apr 24 21:27:15.484460 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:15.484419 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" Apr 24 21:27:15.484897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:15.484545 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:27:15.484897 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.484573 2580 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:15.484897 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.484638 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert podName:0157928b-5de1-4b95-b80f-c4ebf799bce3 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.484622365 +0000 UTC m=+65.468837122 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-8bbnh" (UID: "0157928b-5de1-4b95-b80f-c4ebf799bce3") : secret "networking-console-plugin-cert" not found Apr 24 21:27:15.484897 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.484701 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:15.484897 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:15.484754 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert podName:01ac23ff-5ced-4c6d-b63f-bd951a6746ec nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.484741463 +0000 UTC m=+65.468956224 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert") pod "ingress-canary-7mllt" (UID: "01ac23ff-5ced-4c6d-b63f-bd951a6746ec") : secret "canary-serving-cert" not found Apr 24 21:27:19.854674 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:19.854619 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" Apr 24 21:27:19.854674 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:19.854695 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" Apr 24 21:27:19.855108 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:19.855062 2580 scope.go:117] "RemoveContainer" containerID="d94795a96dbaa1701b7f8584f1a5734872ff15367c972198a09cfa6d337dccc1" Apr 24 21:27:19.855251 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:27:19.855233 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xngnk_openshift-console-operator(6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" podUID="6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd" Apr 24 21:27:27.616207 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:27.616177 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwmjf" Apr 24 21:27:31.323343 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.323302 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:27:31.323760 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.323373 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:27:31.326209 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.326173 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f21b4cf7-af90-4e90-a786-e7f271ec6fcc-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7597v\" (UID: \"f21b4cf7-af90-4e90-a786-e7f271ec6fcc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:27:31.326331 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.326252 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"image-registry-7c59b84694-bjv4k\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") " pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:27:31.424477 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.424437 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:31.424691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.424527 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:31.424691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.424551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:27:31.424691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.424586 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:27:31.425202 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.425169 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b235dfc-3b04-476b-ac23-2c6473035a29-service-ca-bundle\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:31.427128 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.427101 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ac4d827d-3c57-468f-bb34-d01bb87a171e-metrics-tls\") pod \"dns-default-lfbbd\" (UID: \"ac4d827d-3c57-468f-bb34-d01bb87a171e\") " pod="openshift-dns/dns-default-lfbbd" Apr 24 21:27:31.427236 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.427217 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1461942-917b-4737-86c2-fbe05a16beae-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pb7ns\" (UID: \"e1461942-917b-4737-86c2-fbe05a16beae\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:27:31.427283 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.427218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1b235dfc-3b04-476b-ac23-2c6473035a29-metrics-certs\") pod \"router-default-66f8f496b4-4gdhz\" (UID: \"1b235dfc-3b04-476b-ac23-2c6473035a29\") " pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:31.525618 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.525580 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:27:31.525815 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.525638 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" Apr 24 21:27:31.528545 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.528513 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01ac23ff-5ced-4c6d-b63f-bd951a6746ec-cert\") pod \"ingress-canary-7mllt\" (UID: \"01ac23ff-5ced-4c6d-b63f-bd951a6746ec\") " pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:27:31.528545 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.528535 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0157928b-5de1-4b95-b80f-c4ebf799bce3-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-8bbnh\" (UID: \"0157928b-5de1-4b95-b80f-c4ebf799bce3\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" Apr 24 21:27:31.565679 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.565618 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qbk7c\"" Apr 24 21:27:31.571174 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.571151 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-skpc5\"" Apr 24 
21:27:31.573237 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.573218 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:27:31.579990 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.579957 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" Apr 24 21:27:31.599522 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.599493 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-sx8rz\"" Apr 24 21:27:31.607782 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.607753 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:31.620673 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.620477 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bjmj7\"" Apr 24 21:27:31.628945 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.628872 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" Apr 24 21:27:31.651131 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.650866 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kfvbj\"" Apr 24 21:27:31.656647 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.656500 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lfbbd" Apr 24 21:27:31.673735 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.673117 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qc8h5\"" Apr 24 21:27:31.680523 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.680481 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" Apr 24 21:27:31.703505 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.701505 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fgjf8\"" Apr 24 21:27:31.712908 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.709259 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7mllt" Apr 24 21:27:31.758811 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.757268 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c59b84694-bjv4k"] Apr 24 21:27:31.789159 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.788857 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v"] Apr 24 21:27:31.830221 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.828258 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-66f8f496b4-4gdhz"] Apr 24 21:27:31.834092 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:31.834048 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b235dfc_3b04_476b_ac23_2c6473035a29.slice/crio-a237000e3a473eae9455627d89a3781c63d7dc7e78f27297fc189b9bf4e89fdb WatchSource:0}: Error finding container a237000e3a473eae9455627d89a3781c63d7dc7e78f27297fc189b9bf4e89fdb: Status 
404 returned error can't find the container with id a237000e3a473eae9455627d89a3781c63d7dc7e78f27297fc189b9bf4e89fdb Apr 24 21:27:31.868587 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.868337 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns"] Apr 24 21:27:31.871092 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:31.870991 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1461942_917b_4737_86c2_fbe05a16beae.slice/crio-7a5145941239289258730ffc57ded1b9a52ff9d0f6834981720b45425c9c1727 WatchSource:0}: Error finding container 7a5145941239289258730ffc57ded1b9a52ff9d0f6834981720b45425c9c1727: Status 404 returned error can't find the container with id 7a5145941239289258730ffc57ded1b9a52ff9d0f6834981720b45425c9c1727 Apr 24 21:27:31.872435 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.872405 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-66f8f496b4-4gdhz" event={"ID":"1b235dfc-3b04-476b-ac23-2c6473035a29","Type":"ContainerStarted","Data":"a237000e3a473eae9455627d89a3781c63d7dc7e78f27297fc189b9bf4e89fdb"} Apr 24 21:27:31.874529 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.874497 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" event={"ID":"0e205ba8-7366-427a-9381-12562fbe3d12","Type":"ContainerStarted","Data":"b415b92233397dda2c0361b60c0eb7817eec128bbe56b63acdd14f64c5e44a94"} Apr 24 21:27:31.882207 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.882184 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lfbbd"] Apr 24 21:27:31.885688 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:31.885633 2580 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac4d827d_3c57_468f_bb34_d01bb87a171e.slice/crio-909f5f39db14511fd05327370dd8f84eedb87f98207975f68aeeff22037b27be WatchSource:0}: Error finding container 909f5f39db14511fd05327370dd8f84eedb87f98207975f68aeeff22037b27be: Status 404 returned error can't find the container with id 909f5f39db14511fd05327370dd8f84eedb87f98207975f68aeeff22037b27be Apr 24 21:27:31.902622 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.902560 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh"] Apr 24 21:27:31.911757 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:31.911730 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0157928b_5de1_4b95_b80f_c4ebf799bce3.slice/crio-92529c981889247e94ce1ed9e225b01416e6cc2046f599a028f3567516f6e773 WatchSource:0}: Error finding container 92529c981889247e94ce1ed9e225b01416e6cc2046f599a028f3567516f6e773: Status 404 returned error can't find the container with id 92529c981889247e94ce1ed9e225b01416e6cc2046f599a028f3567516f6e773 Apr 24 21:27:31.914893 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:31.914862 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7mllt"] Apr 24 21:27:31.923397 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:31.923370 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ac23ff_5ced_4c6d_b63f_bd951a6746ec.slice/crio-916239ae6f42186aacc4f7e5e82efa07847d246a1d59a55d79304d40741009e5 WatchSource:0}: Error finding container 916239ae6f42186aacc4f7e5e82efa07847d246a1d59a55d79304d40741009e5: Status 404 returned error can't find the container with id 916239ae6f42186aacc4f7e5e82efa07847d246a1d59a55d79304d40741009e5 Apr 24 21:27:32.234072 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:27:32.234033 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:27:32.236465 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.236442 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88be4377-88c5-417f-8cba-f0a7f6d5f16e-metrics-certs\") pod \"network-metrics-daemon-hf9r5\" (UID: \"88be4377-88c5-417f-8cba-f0a7f6d5f16e\") " pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:27:32.309212 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.309178 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vd7zr\"" Apr 24 21:27:32.317045 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.317023 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hf9r5" Apr 24 21:27:32.446986 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.446951 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hf9r5"] Apr 24 21:27:32.883417 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.882591 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-66f8f496b4-4gdhz" event={"ID":"1b235dfc-3b04-476b-ac23-2c6473035a29","Type":"ContainerStarted","Data":"cf958a9056b0fe107b17c54ba9a2cc8f4de64bad5356a12b77404cee9d687f84"} Apr 24 21:27:32.888435 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.888359 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" event={"ID":"0e205ba8-7366-427a-9381-12562fbe3d12","Type":"ContainerStarted","Data":"d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229"} Apr 24 21:27:32.888435 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.888402 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" Apr 24 21:27:32.890271 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.890220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" event={"ID":"f21b4cf7-af90-4e90-a786-e7f271ec6fcc","Type":"ContainerStarted","Data":"509d0425bbe0956e9d2ed70568d941fec2004410c711fa200e2ef9612ad4f763"} Apr 24 21:27:32.893124 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.892884 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hf9r5" event={"ID":"88be4377-88c5-417f-8cba-f0a7f6d5f16e","Type":"ContainerStarted","Data":"8dc01e01d5c97db24a170788b88f1877b215bc6c6bbdc88758d2de4461b29dea"} Apr 24 21:27:32.894843 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.894807 2580 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7mllt" event={"ID":"01ac23ff-5ced-4c6d-b63f-bd951a6746ec","Type":"ContainerStarted","Data":"916239ae6f42186aacc4f7e5e82efa07847d246a1d59a55d79304d40741009e5"} Apr 24 21:27:32.896246 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.896224 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" event={"ID":"0157928b-5de1-4b95-b80f-c4ebf799bce3","Type":"ContainerStarted","Data":"92529c981889247e94ce1ed9e225b01416e6cc2046f599a028f3567516f6e773"} Apr 24 21:27:32.898564 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.898537 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lfbbd" event={"ID":"ac4d827d-3c57-468f-bb34-d01bb87a171e","Type":"ContainerStarted","Data":"909f5f39db14511fd05327370dd8f84eedb87f98207975f68aeeff22037b27be"} Apr 24 21:27:32.900304 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.900275 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" event={"ID":"e1461942-917b-4737-86c2-fbe05a16beae","Type":"ContainerStarted","Data":"7a5145941239289258730ffc57ded1b9a52ff9d0f6834981720b45425c9c1727"} Apr 24 21:27:32.930917 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:32.930857 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-66f8f496b4-4gdhz" podStartSLOduration=63.930837315 podStartE2EDuration="1m3.930837315s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:32.905407461 +0000 UTC m=+66.889622240" watchObservedRunningTime="2026-04-24 21:27:32.930837315 +0000 UTC m=+66.915052101" Apr 24 21:27:33.608145 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:33.608109 2580 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:33.611115 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:33.611062 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:33.633368 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:33.633319 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" podStartSLOduration=66.633302187 podStartE2EDuration="1m6.633302187s" podCreationTimestamp="2026-04-24 21:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:32.93073458 +0000 UTC m=+66.914949359" watchObservedRunningTime="2026-04-24 21:27:33.633302187 +0000 UTC m=+67.617516963" Apr 24 21:27:33.905179 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:33.905140 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:33.906588 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:33.906564 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-66f8f496b4-4gdhz" Apr 24 21:27:34.421399 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.421360 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294"] Apr 24 21:27:34.455251 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.455212 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294"] Apr 24 21:27:34.455432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.455360 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.458594 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.458565 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:27:34.458816 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.458791 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 21:27:34.458941 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.458882 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 21:27:34.459082 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.459065 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:27:34.459275 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.459251 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 21:27:34.459406 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.459273 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 21:27:34.459406 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.459287 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:27:34.555030 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.554993 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-thvrn\" (UniqueName: \"kubernetes.io/projected/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-kube-api-access-thvrn\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.555228 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.555066 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-hub\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.555228 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.555099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.555228 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.555130 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-ca\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.555228 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.555162 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.555432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.555233 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.559140 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.559109 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jjpnr"] Apr 24 21:27:34.580792 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.580754 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jjpnr"] Apr 24 21:27:34.580971 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.580911 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.581472 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.581449 2580 scope.go:117] "RemoveContainer" containerID="d94795a96dbaa1701b7f8584f1a5734872ff15367c972198a09cfa6d337dccc1" Apr 24 21:27:34.583561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.583540 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:27:34.583722 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.583705 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-85klr\"" Apr 24 21:27:34.583950 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.583934 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:27:34.656667 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.656632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-ca\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.656711 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.656742 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.656795 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6bed2968-018b-485a-8c19-169e6c4ebbb5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.656838 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thvrn\" (UniqueName: \"kubernetes.io/projected/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-kube-api-access-thvrn\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.656921 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6bed2968-018b-485a-8c19-169e6c4ebbb5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.656965 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/6bed2968-018b-485a-8c19-169e6c4ebbb5-data-volume\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.657003 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-hub\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.657040 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.657077 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.657065 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n48c\" (UniqueName: \"kubernetes.io/projected/6bed2968-018b-485a-8c19-169e6c4ebbb5-kube-api-access-6n48c\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.657565 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.657232 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6bed2968-018b-485a-8c19-169e6c4ebbb5-crio-socket\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " 
pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.660011 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.659979 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-ca\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.660232 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.660209 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.660520 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.660500 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.660761 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.660738 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-hub\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.667195 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.667165 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.680208 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.680130 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thvrn\" (UniqueName: \"kubernetes.io/projected/2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f-kube-api-access-thvrn\") pod \"cluster-proxy-proxy-agent-7c9f5cdbc4-8b294\" (UID: \"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.758726 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.758690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6bed2968-018b-485a-8c19-169e6c4ebbb5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.758922 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.758762 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6bed2968-018b-485a-8c19-169e6c4ebbb5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.758922 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.758796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6bed2968-018b-485a-8c19-169e6c4ebbb5-data-volume\") pod \"insights-runtime-extractor-jjpnr\" (UID: 
\"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.758922 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.758840 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n48c\" (UniqueName: \"kubernetes.io/projected/6bed2968-018b-485a-8c19-169e6c4ebbb5-kube-api-access-6n48c\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.759261 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.759234 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6bed2968-018b-485a-8c19-169e6c4ebbb5-crio-socket\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.759343 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.759322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6bed2968-018b-485a-8c19-169e6c4ebbb5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.759383 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.759343 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6bed2968-018b-485a-8c19-169e6c4ebbb5-crio-socket\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.759383 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.759352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/6bed2968-018b-485a-8c19-169e6c4ebbb5-data-volume\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.761421 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.761394 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6bed2968-018b-485a-8c19-169e6c4ebbb5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.774447 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.774414 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" Apr 24 21:27:34.774807 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.774772 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n48c\" (UniqueName: \"kubernetes.io/projected/6bed2968-018b-485a-8c19-169e6c4ebbb5-kube-api-access-6n48c\") pod \"insights-runtime-extractor-jjpnr\" (UID: \"6bed2968-018b-485a-8c19-169e6c4ebbb5\") " pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:34.892442 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:34.892408 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jjpnr" Apr 24 21:27:37.504871 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.503911 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jjpnr"] Apr 24 21:27:37.521738 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:37.521651 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bed2968_018b_485a_8c19_169e6c4ebbb5.slice/crio-d72c0864f03c53589521809a1d2d0721068aae9b14ed23521b639ed87cdc168a WatchSource:0}: Error finding container d72c0864f03c53589521809a1d2d0721068aae9b14ed23521b639ed87cdc168a: Status 404 returned error can't find the container with id d72c0864f03c53589521809a1d2d0721068aae9b14ed23521b639ed87cdc168a Apr 24 21:27:37.536278 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.534473 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294"] Apr 24 21:27:37.570708 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:37.569028 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c2c6eb7_0bda_41c3_a69a_b3ffefbc674f.slice/crio-49b220a64f964e933b184ec4c6828ecf1778ff92a7780e60fd6102e8a9870df8 WatchSource:0}: Error finding container 49b220a64f964e933b184ec4c6828ecf1778ff92a7780e60fd6102e8a9870df8: Status 404 returned error can't find the container with id 49b220a64f964e933b184ec4c6828ecf1778ff92a7780e60fd6102e8a9870df8 Apr 24 21:27:37.924402 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.924356 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" event={"ID":"0157928b-5de1-4b95-b80f-c4ebf799bce3","Type":"ContainerStarted","Data":"7a5edec69cb8b0d31bf0346107634392ce63a005d6dca90d0a591674a380ff7f"} Apr 24 
21:27:37.927513 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.926873 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lfbbd" event={"ID":"ac4d827d-3c57-468f-bb34-d01bb87a171e","Type":"ContainerStarted","Data":"278d4d1759c94050274bfe1ce8ebe56511e21fd097e272aad20aadeb842f3c3f"} Apr 24 21:27:37.927513 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.927462 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lfbbd" Apr 24 21:27:37.927513 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.927482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lfbbd" event={"ID":"ac4d827d-3c57-468f-bb34-d01bb87a171e","Type":"ContainerStarted","Data":"4050e70db70bb4021812542cebc2230f5c272765d6e987b7f69da5a26ad0240f"} Apr 24 21:27:37.928980 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.928931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" event={"ID":"e1461942-917b-4737-86c2-fbe05a16beae","Type":"ContainerStarted","Data":"c4cc7d7dfb68adf70a0fc952eb9b94c626cda1ec1c894b39484d66b33d952b75"} Apr 24 21:27:37.931006 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.930978 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7mllt" event={"ID":"01ac23ff-5ced-4c6d-b63f-bd951a6746ec","Type":"ContainerStarted","Data":"938f18c1ed6a751c2d7f2b83634cf74bc73326ef008fa2e7fc1810c2f33652e7"} Apr 24 21:27:37.932961 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.932916 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" event={"ID":"f21b4cf7-af90-4e90-a786-e7f271ec6fcc","Type":"ContainerStarted","Data":"6c6e53c041fe61caecd704d45bd81c4d97c57697f4b546054f292224bf639407"} Apr 24 21:27:37.932961 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:27:37.932941 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" event={"ID":"f21b4cf7-af90-4e90-a786-e7f271ec6fcc","Type":"ContainerStarted","Data":"4e853ab788b3cad42b5574ba24216a63972e20385e473be8675b6c8457c33a36"} Apr 24 21:27:37.935303 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.935266 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hf9r5" event={"ID":"88be4377-88c5-417f-8cba-f0a7f6d5f16e","Type":"ContainerStarted","Data":"5d132011f518aa9308b3b9c4d98af1d071d3d09bd0400665b20fb156273712fc"} Apr 24 21:27:37.935303 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.935288 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hf9r5" event={"ID":"88be4377-88c5-417f-8cba-f0a7f6d5f16e","Type":"ContainerStarted","Data":"9ee330f9111008aaa847c8263a583760a990e20b0d45bf7e9791e8cbe507bdec"} Apr 24 21:27:37.937289 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.937267 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:27:37.937348 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.937322 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" event={"ID":"6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd","Type":"ContainerStarted","Data":"d43d104b4b33fc07c8387baa7616ded9f02bc6c41e4fe4c85aef5b85cf9a81c1"} Apr 24 21:27:37.937827 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.937808 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" Apr 24 21:27:37.939494 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.939471 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-jjpnr" event={"ID":"6bed2968-018b-485a-8c19-169e6c4ebbb5","Type":"ContainerStarted","Data":"d7724737bc9d2bdfe1116baeb485d1bd3310fb016d5d855d009da5d8e060b7da"}
Apr 24 21:27:37.939636 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.939616 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjpnr" event={"ID":"6bed2968-018b-485a-8c19-169e6c4ebbb5","Type":"ContainerStarted","Data":"d72c0864f03c53589521809a1d2d0721068aae9b14ed23521b639ed87cdc168a"}
Apr 24 21:27:37.940533 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.940516 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" event={"ID":"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f","Type":"ContainerStarted","Data":"49b220a64f964e933b184ec4c6828ecf1778ff92a7780e60fd6102e8a9870df8"}
Apr 24 21:27:37.941752 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.941702 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-8bbnh" podStartSLOduration=57.53075932 podStartE2EDuration="1m2.941692212s" podCreationTimestamp="2026-04-24 21:26:35 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.915275029 +0000 UTC m=+65.899489786" lastFinishedPulling="2026-04-24 21:27:37.326207916 +0000 UTC m=+71.310422678" observedRunningTime="2026-04-24 21:27:37.941316517 +0000 UTC m=+71.925531296" watchObservedRunningTime="2026-04-24 21:27:37.941692212 +0000 UTC m=+71.925906991"
Apr 24 21:27:37.965323 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.965235 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pb7ns" podStartSLOduration=63.508941612 podStartE2EDuration="1m8.965214736s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.873113914 +0000 UTC m=+65.857328672" lastFinishedPulling="2026-04-24 21:27:37.329387024 +0000 UTC m=+71.313601796" observedRunningTime="2026-04-24 21:27:37.963325175 +0000 UTC m=+71.947539954" watchObservedRunningTime="2026-04-24 21:27:37.965214736 +0000 UTC m=+71.949429516"
Apr 24 21:27:37.987192 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:37.987143 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lfbbd" podStartSLOduration=33.546966596 podStartE2EDuration="38.987127403s" podCreationTimestamp="2026-04-24 21:26:59 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.888575835 +0000 UTC m=+65.872790594" lastFinishedPulling="2026-04-24 21:27:37.328736631 +0000 UTC m=+71.312951401" observedRunningTime="2026-04-24 21:27:37.986185761 +0000 UTC m=+71.970400539" watchObservedRunningTime="2026-04-24 21:27:37.987127403 +0000 UTC m=+71.971342182"
Apr 24 21:27:38.021489 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:38.021376 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk" podStartSLOduration=58.731049342 podStartE2EDuration="1m8.021356672s" podCreationTimestamp="2026-04-24 21:26:30 +0000 UTC" firstStartedPulling="2026-04-24 21:27:00.74064708 +0000 UTC m=+34.724861844" lastFinishedPulling="2026-04-24 21:27:10.030954403 +0000 UTC m=+44.015169174" observedRunningTime="2026-04-24 21:27:38.020712371 +0000 UTC m=+72.004927150" watchObservedRunningTime="2026-04-24 21:27:38.021356672 +0000 UTC m=+72.005571450"
Apr 24 21:27:38.042132 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:38.042066 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hf9r5" podStartSLOduration=67.167289259 podStartE2EDuration="1m12.042051141s" podCreationTimestamp="2026-04-24 21:26:26 +0000 UTC" firstStartedPulling="2026-04-24 21:27:32.454004102 +0000 UTC m=+66.438218859" lastFinishedPulling="2026-04-24 21:27:37.328765976 +0000 UTC m=+71.312980741" observedRunningTime="2026-04-24 21:27:38.041046402 +0000 UTC m=+72.025261209" watchObservedRunningTime="2026-04-24 21:27:38.042051141 +0000 UTC m=+72.026265939"
Apr 24 21:27:38.064841 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:38.064780 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7597v" podStartSLOduration=63.617307822 podStartE2EDuration="1m9.064767921s" podCreationTimestamp="2026-04-24 21:26:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.878759215 +0000 UTC m=+65.862973976" lastFinishedPulling="2026-04-24 21:27:37.326219313 +0000 UTC m=+71.310434075" observedRunningTime="2026-04-24 21:27:38.06310537 +0000 UTC m=+72.047320150" watchObservedRunningTime="2026-04-24 21:27:38.064767921 +0000 UTC m=+72.048982699"
Apr 24 21:27:38.082269 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:38.082215 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7mllt" podStartSLOduration=33.681607341 podStartE2EDuration="39.082197008s" podCreationTimestamp="2026-04-24 21:26:59 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.925616674 +0000 UTC m=+65.909831432" lastFinishedPulling="2026-04-24 21:27:37.32620633 +0000 UTC m=+71.310421099" observedRunningTime="2026-04-24 21:27:38.08059861 +0000 UTC m=+72.064813390" watchObservedRunningTime="2026-04-24 21:27:38.082197008 +0000 UTC m=+72.066411788"
Apr 24 21:27:38.460033 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:38.459985 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-xngnk"
Apr 24 21:27:39.952585 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:39.952534 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjpnr" event={"ID":"6bed2968-018b-485a-8c19-169e6c4ebbb5","Type":"ContainerStarted","Data":"77ef2dde064269a946701d26b4705934e60c2a6be175cc40d5dc5dd387664256"}
Apr 24 21:27:41.959320 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:41.959281 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" event={"ID":"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f","Type":"ContainerStarted","Data":"b173b635e047151eb96cde59f10cbde89b377fdce473a7a41322ab801a17eda4"}
Apr 24 21:27:42.819547 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:42.819513 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hzw5v"
Apr 24 21:27:42.964632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:42.964592 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjpnr" event={"ID":"6bed2968-018b-485a-8c19-169e6c4ebbb5","Type":"ContainerStarted","Data":"091b815134500e2e1e94c0115991608943d54b406a810c288f03a1e513ca76b7"}
Apr 24 21:27:42.985360 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:42.984986 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jjpnr" podStartSLOduration=4.377511761 podStartE2EDuration="8.984968543s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.68830176 +0000 UTC m=+71.672516522" lastFinishedPulling="2026-04-24 21:27:42.295758545 +0000 UTC m=+76.279973304" observedRunningTime="2026-04-24 21:27:42.983393809 +0000 UTC m=+76.967608587" watchObservedRunningTime="2026-04-24 21:27:42.984968543 +0000 UTC m=+76.969183712"
Apr 24 21:27:43.969840 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:43.969796 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" event={"ID":"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f","Type":"ContainerStarted","Data":"8e9999ed3ee4db0d566e5a210dda72617a5744a3391ebd4c5419ab0a90d97b73"}
Apr 24 21:27:43.969840 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:43.969845 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" event={"ID":"2c2c6eb7-0bda-41c3-a69a-b3ffefbc674f","Type":"ContainerStarted","Data":"024ccd7ab53773b5a5bffb1e59184a7b81401f700ab1a12e84394576ba4b51fc"}
Apr 24 21:27:43.989621 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:43.989567 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c9f5cdbc4-8b294" podStartSLOduration=4.084651493 podStartE2EDuration="9.989551786s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.57886726 +0000 UTC m=+71.563082020" lastFinishedPulling="2026-04-24 21:27:43.483767549 +0000 UTC m=+77.467982313" observedRunningTime="2026-04-24 21:27:43.987903196 +0000 UTC m=+77.972117976" watchObservedRunningTime="2026-04-24 21:27:43.989551786 +0000 UTC m=+77.973766592"
Apr 24 21:27:47.693553 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.693519 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7mgnq"]
Apr 24 21:27:47.698430 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.698404 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.700845 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.700824 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:27:47.701268 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.701246 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lx6q7\""
Apr 24 21:27:47.701363 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.701250 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:27:47.701856 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.701839 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:27:47.702014 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.701998 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:27:47.879768 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.879732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-root\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.879768 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.879773 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-wtmp\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.879985 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.879808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-textfile\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.879985 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.879862 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.879985 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.879896 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-sys\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.879985 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.879911 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcqg9\" (UniqueName: \"kubernetes.io/projected/b7c71279-6058-4d71-996a-5f54db9b0320-kube-api-access-xcqg9\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.879985 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.879937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7c71279-6058-4d71-996a-5f54db9b0320-metrics-client-ca\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.880149 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.879996 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-accelerators-collector-config\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.880149 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.880052 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-tls\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.945874 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.945788 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lfbbd"
Apr 24 21:27:47.980574 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-textfile\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.980785 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980594 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.980785 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980641 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-sys\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.980785 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcqg9\" (UniqueName: \"kubernetes.io/projected/b7c71279-6058-4d71-996a-5f54db9b0320-kube-api-access-xcqg9\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.980785 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980710 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7c71279-6058-4d71-996a-5f54db9b0320-metrics-client-ca\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.980785 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980740 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-accelerators-collector-config\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.980785 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-tls\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.981121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980841 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-root\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.981121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980871 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-wtmp\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.981121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.980917 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-textfile\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.981121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.981023 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-wtmp\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.981453 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.981436 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7c71279-6058-4d71-996a-5f54db9b0320-metrics-client-ca\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.982223 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.982187 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-accelerators-collector-config\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.982361 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.982347 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-root\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.982433 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.982421 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7c71279-6058-4d71-996a-5f54db9b0320-sys\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.984247 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.984215 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.984731 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.984715 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b7c71279-6058-4d71-996a-5f54db9b0320-node-exporter-tls\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:47.990008 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:47.989977 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcqg9\" (UniqueName: \"kubernetes.io/projected/b7c71279-6058-4d71-996a-5f54db9b0320-kube-api-access-xcqg9\") pod \"node-exporter-7mgnq\" (UID: \"b7c71279-6058-4d71-996a-5f54db9b0320\") " pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:48.009911 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:48.009878 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7mgnq"
Apr 24 21:27:48.019555 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:27:48.019525 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7c71279_6058_4d71_996a_5f54db9b0320.slice/crio-7b4cee92baf4f070ecfc890edc23a0091e678bde771e1a1e6f6058fbac132f68 WatchSource:0}: Error finding container 7b4cee92baf4f070ecfc890edc23a0091e678bde771e1a1e6f6058fbac132f68: Status 404 returned error can't find the container with id 7b4cee92baf4f070ecfc890edc23a0091e678bde771e1a1e6f6058fbac132f68
Apr 24 21:27:48.988686 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:48.988628 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7mgnq" event={"ID":"b7c71279-6058-4d71-996a-5f54db9b0320","Type":"ContainerStarted","Data":"b5d78edf7dd06f3748b5dbc6b39c4118e80e46d469752e62d95378372b7c3e52"}
Apr 24 21:27:48.989123 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:48.988701 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7mgnq" event={"ID":"b7c71279-6058-4d71-996a-5f54db9b0320","Type":"ContainerStarted","Data":"7b4cee92baf4f070ecfc890edc23a0091e678bde771e1a1e6f6058fbac132f68"}
Apr 24 21:27:49.993354 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:49.993316 2580 generic.go:358] "Generic (PLEG): container finished" podID="b7c71279-6058-4d71-996a-5f54db9b0320" containerID="b5d78edf7dd06f3748b5dbc6b39c4118e80e46d469752e62d95378372b7c3e52" exitCode=0
Apr 24 21:27:49.993771 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:49.993390 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7mgnq" event={"ID":"b7c71279-6058-4d71-996a-5f54db9b0320","Type":"ContainerDied","Data":"b5d78edf7dd06f3748b5dbc6b39c4118e80e46d469752e62d95378372b7c3e52"}
Apr 24 21:27:50.998582 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:50.998546 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7mgnq" event={"ID":"b7c71279-6058-4d71-996a-5f54db9b0320","Type":"ContainerStarted","Data":"1008eb424ec89b1f99097143b4affecb097aa91bdb5bc565c23e37a9c4ccab5b"}
Apr 24 21:27:50.998582 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:50.998588 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7mgnq" event={"ID":"b7c71279-6058-4d71-996a-5f54db9b0320","Type":"ContainerStarted","Data":"4b14236f9221520a84b43668ec050624b2c0d86cb4ce2414f311a0dcc9fac915"}
Apr 24 21:27:51.021714 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:51.021645 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7mgnq" podStartSLOduration=3.212070888 podStartE2EDuration="4.021630122s" podCreationTimestamp="2026-04-24 21:27:47 +0000 UTC" firstStartedPulling="2026-04-24 21:27:48.021459458 +0000 UTC m=+82.005674218" lastFinishedPulling="2026-04-24 21:27:48.831018691 +0000 UTC m=+82.815233452" observedRunningTime="2026-04-24 21:27:51.019783665 +0000 UTC m=+85.003998444" watchObservedRunningTime="2026-04-24 21:27:51.021630122 +0000 UTC m=+85.005844901"
Apr 24 21:27:51.578071 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:51.578033 2580 patch_prober.go:28] interesting pod/image-registry-7c59b84694-bjv4k container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 21:27:51.578282 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:51.578096 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" podUID="0e205ba8-7366-427a-9381-12562fbe3d12" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:27:53.909009 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:53.908978 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:27:56.716080 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:27:56.716044 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7c59b84694-bjv4k"]
Apr 24 21:28:21.740689 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:21.740632 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" podUID="0e205ba8-7366-427a-9381-12562fbe3d12" containerName="registry" containerID="cri-o://d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229" gracePeriod=30
Apr 24 21:28:21.988640 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:21.988612 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:28:22.056932 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.056839 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") pod \"0e205ba8-7366-427a-9381-12562fbe3d12\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") "
Apr 24 21:28:22.056932 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.056905 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-image-registry-private-configuration\") pod \"0e205ba8-7366-427a-9381-12562fbe3d12\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") "
Apr 24 21:28:22.057169 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.056944 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e205ba8-7366-427a-9381-12562fbe3d12-ca-trust-extracted\") pod \"0e205ba8-7366-427a-9381-12562fbe3d12\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") "
Apr 24 21:28:22.057169 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.056999 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87pb2\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-kube-api-access-87pb2\") pod \"0e205ba8-7366-427a-9381-12562fbe3d12\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") "
Apr 24 21:28:22.057169 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.057034 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-trusted-ca\") pod \"0e205ba8-7366-427a-9381-12562fbe3d12\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") "
Apr 24 21:28:22.057169 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.057056 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-bound-sa-token\") pod \"0e205ba8-7366-427a-9381-12562fbe3d12\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") "
Apr 24 21:28:22.057169 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.057085 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-installation-pull-secrets\") pod \"0e205ba8-7366-427a-9381-12562fbe3d12\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") "
Apr 24 21:28:22.057169 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.057117 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-registry-certificates\") pod \"0e205ba8-7366-427a-9381-12562fbe3d12\" (UID: \"0e205ba8-7366-427a-9381-12562fbe3d12\") "
Apr 24 21:28:22.057677 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.057631 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0e205ba8-7366-427a-9381-12562fbe3d12" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:28:22.058018 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.057970 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0e205ba8-7366-427a-9381-12562fbe3d12" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:28:22.059978 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.059951 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-kube-api-access-87pb2" (OuterVolumeSpecName: "kube-api-access-87pb2") pod "0e205ba8-7366-427a-9381-12562fbe3d12" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12"). InnerVolumeSpecName "kube-api-access-87pb2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:28:22.060095 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.059976 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0e205ba8-7366-427a-9381-12562fbe3d12" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:28:22.060095 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.059990 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0e205ba8-7366-427a-9381-12562fbe3d12" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:28:22.060095 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.060058 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0e205ba8-7366-427a-9381-12562fbe3d12" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:28:22.060266 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.060180 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0e205ba8-7366-427a-9381-12562fbe3d12" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:28:22.065339 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.065312 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e205ba8-7366-427a-9381-12562fbe3d12-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0e205ba8-7366-427a-9381-12562fbe3d12" (UID: "0e205ba8-7366-427a-9381-12562fbe3d12"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:22.104538 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.104503 2580 generic.go:358] "Generic (PLEG): container finished" podID="0e205ba8-7366-427a-9381-12562fbe3d12" containerID="d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229" exitCode=0
Apr 24 21:28:22.104707 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.104560 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" event={"ID":"0e205ba8-7366-427a-9381-12562fbe3d12","Type":"ContainerDied","Data":"d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229"}
Apr 24 21:28:22.104707 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.104590 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k" event={"ID":"0e205ba8-7366-427a-9381-12562fbe3d12","Type":"ContainerDied","Data":"b415b92233397dda2c0361b60c0eb7817eec128bbe56b63acdd14f64c5e44a94"}
Apr 24 21:28:22.104707 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.104604 2580 scope.go:117] "RemoveContainer" containerID="d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229"
Apr 24 21:28:22.104707 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.104565 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c59b84694-bjv4k"
Apr 24 21:28:22.113338 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.113313 2580 scope.go:117] "RemoveContainer" containerID="d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229"
Apr 24 21:28:22.113610 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:28:22.113588 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229\": container with ID starting with d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229 not found: ID does not exist" containerID="d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229"
Apr 24 21:28:22.113717 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.113618 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229"} err="failed to get container status \"d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229\": rpc error: code = NotFound desc = could not find container \"d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229\": container with ID starting with d1a3c2d14c865822372f4fff07fe5561617f1143301c7e9d8fa8c4856d5d0229 not found: ID does not exist"
Apr 24 21:28:22.126629 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.126605 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7c59b84694-bjv4k"]
Apr 24 21:28:22.131059 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.131036 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7c59b84694-bjv4k"]
Apr 24 21:28:22.158033 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.157997 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-87pb2\" (UniqueName:
\"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-kube-api-access-87pb2\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:28:22.158033 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.158031 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-trusted-ca\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:28:22.158033 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.158041 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-bound-sa-token\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:28:22.158248 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.158049 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-installation-pull-secrets\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:28:22.158248 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.158059 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e205ba8-7366-427a-9381-12562fbe3d12-registry-certificates\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:28:22.158248 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.158068 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e205ba8-7366-427a-9381-12562fbe3d12-registry-tls\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:28:22.158248 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.158077 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/0e205ba8-7366-427a-9381-12562fbe3d12-image-registry-private-configuration\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:28:22.158248 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.158086 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e205ba8-7366-427a-9381-12562fbe3d12-ca-trust-extracted\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:28:22.577539 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:22.577507 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e205ba8-7366-427a-9381-12562fbe3d12" path="/var/lib/kubelet/pods/0e205ba8-7366-427a-9381-12562fbe3d12/volumes" Apr 24 21:28:28.124536 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:28.124503 2580 generic.go:358] "Generic (PLEG): container finished" podID="fd03e3ed-1908-42a1-8d28-3bdf4b8e27be" containerID="dc359d6b9c8db265d5e5c5e59e3704ac5b2fc3118b8136b17b1911d814db7527" exitCode=0 Apr 24 21:28:28.124987 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:28.124543 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" event={"ID":"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be","Type":"ContainerDied","Data":"dc359d6b9c8db265d5e5c5e59e3704ac5b2fc3118b8136b17b1911d814db7527"} Apr 24 21:28:28.124987 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:28.124872 2580 scope.go:117] "RemoveContainer" containerID="dc359d6b9c8db265d5e5c5e59e3704ac5b2fc3118b8136b17b1911d814db7527" Apr 24 21:28:29.129108 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:29.129071 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-46xnr" 
event={"ID":"fd03e3ed-1908-42a1-8d28-3bdf4b8e27be","Type":"ContainerStarted","Data":"1ca4f36b6b410285ac1cdb5d983b9941feea9a7ea4f0e6e03496d733262f1bf2"} Apr 24 21:28:41.164755 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:41.164713 2580 generic.go:358] "Generic (PLEG): container finished" podID="508d1a61-2ffa-465f-adcc-b555600791a5" containerID="e181dbec454ddcf8b94cbfe63e51f437c05e066c71e2a1ab391f413c8b66bd0d" exitCode=0 Apr 24 21:28:41.165121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:41.164794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5" event={"ID":"508d1a61-2ffa-465f-adcc-b555600791a5","Type":"ContainerDied","Data":"e181dbec454ddcf8b94cbfe63e51f437c05e066c71e2a1ab391f413c8b66bd0d"} Apr 24 21:28:41.165157 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:41.165145 2580 scope.go:117] "RemoveContainer" containerID="e181dbec454ddcf8b94cbfe63e51f437c05e066c71e2a1ab391f413c8b66bd0d" Apr 24 21:28:42.169091 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:42.169057 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-cfgz5" event={"ID":"508d1a61-2ffa-465f-adcc-b555600791a5","Type":"ContainerStarted","Data":"c147b550236f129f8cd6c292016c79640a88b668c213b0851d79bb7a1e7dbf21"} Apr 24 21:28:47.184089 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:47.184049 2580 generic.go:358] "Generic (PLEG): container finished" podID="48f5d05c-b88d-481a-b374-755d285e0f8f" containerID="16809ff2186ed7e1e34fb45ca68962bd330fc74f13b2b9d0f91b9849ea672445" exitCode=0 Apr 24 21:28:47.184458 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:47.184124 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6rdb2" event={"ID":"48f5d05c-b88d-481a-b374-755d285e0f8f","Type":"ContainerDied","Data":"16809ff2186ed7e1e34fb45ca68962bd330fc74f13b2b9d0f91b9849ea672445"} Apr 24 21:28:47.184498 
ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:47.184463 2580 scope.go:117] "RemoveContainer" containerID="16809ff2186ed7e1e34fb45ca68962bd330fc74f13b2b9d0f91b9849ea672445" Apr 24 21:28:48.189387 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:28:48.189355 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-6rdb2" event={"ID":"48f5d05c-b88d-481a-b374-755d285e0f8f","Type":"ContainerStarted","Data":"2998925da03e7624c21fc5d6cec5d28d734b9a1d3494561f2fe5e60d873d2571"} Apr 24 21:31:13.056595 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.056548 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-p7rjr"] Apr 24 21:31:13.057177 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.056959 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e205ba8-7366-427a-9381-12562fbe3d12" containerName="registry" Apr 24 21:31:13.057177 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.056977 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e205ba8-7366-427a-9381-12562fbe3d12" containerName="registry" Apr 24 21:31:13.057177 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.057073 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e205ba8-7366-427a-9381-12562fbe3d12" containerName="registry" Apr 24 21:31:13.059896 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.059875 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:13.064237 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.064209 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:31:13.064745 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.064721 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:31:13.065052 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.065023 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-n2vl9\"" Apr 24 21:31:13.065052 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.065047 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:31:13.065210 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.065183 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:31:13.077437 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.077412 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-p7rjr"] Apr 24 21:31:13.206233 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.206193 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-certificates\") pod \"keda-admission-cf49989db-p7rjr\" (UID: \"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818\") " pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:13.206436 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.206242 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhsx\" (UniqueName: 
\"kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-kube-api-access-srhsx\") pod \"keda-admission-cf49989db-p7rjr\" (UID: \"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818\") " pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:13.307064 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.306971 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-certificates\") pod \"keda-admission-cf49989db-p7rjr\" (UID: \"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818\") " pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:13.307064 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.307018 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srhsx\" (UniqueName: \"kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-kube-api-access-srhsx\") pod \"keda-admission-cf49989db-p7rjr\" (UID: \"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818\") " pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:13.307243 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:31:13.307123 2580 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 24 21:31:13.307243 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:31:13.307151 2580 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-p7rjr: secret "keda-admission-webhooks-certs" not found Apr 24 21:31:13.307243 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:31:13.307218 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-certificates podName:30a9fe46-a1c6-4cbe-9fe9-60e673f9e818 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:13.807201187 +0000 UTC m=+287.791415943 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-certificates") pod "keda-admission-cf49989db-p7rjr" (UID: "30a9fe46-a1c6-4cbe-9fe9-60e673f9e818") : secret "keda-admission-webhooks-certs" not found Apr 24 21:31:13.319894 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.319858 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhsx\" (UniqueName: \"kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-kube-api-access-srhsx\") pod \"keda-admission-cf49989db-p7rjr\" (UID: \"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818\") " pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:13.811946 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.811905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-certificates\") pod \"keda-admission-cf49989db-p7rjr\" (UID: \"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818\") " pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:13.814491 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.814461 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30a9fe46-a1c6-4cbe-9fe9-60e673f9e818-certificates\") pod \"keda-admission-cf49989db-p7rjr\" (UID: \"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818\") " pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:13.974622 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:13.974562 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:14.126496 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:14.126414 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-p7rjr"] Apr 24 21:31:14.133565 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:31:14.133535 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30a9fe46_a1c6_4cbe_9fe9_60e673f9e818.slice/crio-2c1cf1e87e6d862c64d13d86dc32fc13c338ad799bf5f452edb26bec02d69423 WatchSource:0}: Error finding container 2c1cf1e87e6d862c64d13d86dc32fc13c338ad799bf5f452edb26bec02d69423: Status 404 returned error can't find the container with id 2c1cf1e87e6d862c64d13d86dc32fc13c338ad799bf5f452edb26bec02d69423 Apr 24 21:31:14.588309 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:14.588267 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-p7rjr" event={"ID":"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818","Type":"ContainerStarted","Data":"2c1cf1e87e6d862c64d13d86dc32fc13c338ad799bf5f452edb26bec02d69423"} Apr 24 21:31:17.598153 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:17.598116 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-p7rjr" event={"ID":"30a9fe46-a1c6-4cbe-9fe9-60e673f9e818","Type":"ContainerStarted","Data":"9dce12300b9553371e126dfcd0663382d59ee0266b1a49fed5cfe71d7b5db0c3"} Apr 24 21:31:17.598556 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:17.598170 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:31:17.626574 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:17.626514 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-p7rjr" podStartSLOduration=2.161916742 podStartE2EDuration="4.62649548s" 
podCreationTimestamp="2026-04-24 21:31:13 +0000 UTC" firstStartedPulling="2026-04-24 21:31:14.134819067 +0000 UTC m=+288.119033823" lastFinishedPulling="2026-04-24 21:31:16.599397801 +0000 UTC m=+290.583612561" observedRunningTime="2026-04-24 21:31:17.625033027 +0000 UTC m=+291.609247816" watchObservedRunningTime="2026-04-24 21:31:17.62649548 +0000 UTC m=+291.610710260" Apr 24 21:31:26.466267 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:26.466229 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:31:26.466810 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:26.466791 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:31:26.474405 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:26.474380 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:31:38.603583 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:31:38.603543 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-p7rjr" Apr 24 21:32:40.846310 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.846270 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74"] Apr 24 21:32:40.849651 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.849633 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" Apr 24 21:32:40.852145 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.852106 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:32:40.852299 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.852247 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 24 21:32:40.852379 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.852353 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-mqccc\"" Apr 24 21:32:40.859107 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.859079 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74"] Apr 24 21:32:40.898052 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.898004 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5n4r\" (UniqueName: \"kubernetes.io/projected/c0fc86b5-b817-4040-b25b-4a48aaabf1d2-kube-api-access-r5n4r\") pod \"openshift-lws-operator-bfc7f696d-4bw74\" (UID: \"c0fc86b5-b817-4040-b25b-4a48aaabf1d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" Apr 24 21:32:40.898239 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.898081 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0fc86b5-b817-4040-b25b-4a48aaabf1d2-tmp\") pod \"openshift-lws-operator-bfc7f696d-4bw74\" (UID: \"c0fc86b5-b817-4040-b25b-4a48aaabf1d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" Apr 24 21:32:40.998593 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.998553 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0fc86b5-b817-4040-b25b-4a48aaabf1d2-tmp\") pod \"openshift-lws-operator-bfc7f696d-4bw74\" (UID: \"c0fc86b5-b817-4040-b25b-4a48aaabf1d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" Apr 24 21:32:40.998788 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.998637 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5n4r\" (UniqueName: \"kubernetes.io/projected/c0fc86b5-b817-4040-b25b-4a48aaabf1d2-kube-api-access-r5n4r\") pod \"openshift-lws-operator-bfc7f696d-4bw74\" (UID: \"c0fc86b5-b817-4040-b25b-4a48aaabf1d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" Apr 24 21:32:40.998974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:40.998951 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0fc86b5-b817-4040-b25b-4a48aaabf1d2-tmp\") pod \"openshift-lws-operator-bfc7f696d-4bw74\" (UID: \"c0fc86b5-b817-4040-b25b-4a48aaabf1d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" Apr 24 21:32:41.012900 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:41.012867 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5n4r\" (UniqueName: \"kubernetes.io/projected/c0fc86b5-b817-4040-b25b-4a48aaabf1d2-kube-api-access-r5n4r\") pod \"openshift-lws-operator-bfc7f696d-4bw74\" (UID: \"c0fc86b5-b817-4040-b25b-4a48aaabf1d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" Apr 24 21:32:41.159924 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:41.159887 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" Apr 24 21:32:41.288466 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:41.288264 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74"] Apr 24 21:32:41.291459 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:32:41.291427 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0fc86b5_b817_4040_b25b_4a48aaabf1d2.slice/crio-53c07674bd09a4fb4c93633eb521d3182de09c02ab40e67e551cee8241a300be WatchSource:0}: Error finding container 53c07674bd09a4fb4c93633eb521d3182de09c02ab40e67e551cee8241a300be: Status 404 returned error can't find the container with id 53c07674bd09a4fb4c93633eb521d3182de09c02ab40e67e551cee8241a300be Apr 24 21:32:41.292911 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:41.292890 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:32:41.831517 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:41.831481 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" event={"ID":"c0fc86b5-b817-4040-b25b-4a48aaabf1d2","Type":"ContainerStarted","Data":"53c07674bd09a4fb4c93633eb521d3182de09c02ab40e67e551cee8241a300be"} Apr 24 21:32:43.838708 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:43.838648 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" event={"ID":"c0fc86b5-b817-4040-b25b-4a48aaabf1d2","Type":"ContainerStarted","Data":"2243afe4e8e9f34b7bdea3dac1cf2a31ec702dcd88ab8f0aac56b6df4ca41cbc"} Apr 24 21:32:43.858400 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:32:43.858332 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-4bw74" 
podStartSLOduration=1.394468041 podStartE2EDuration="3.85831533s" podCreationTimestamp="2026-04-24 21:32:40 +0000 UTC" firstStartedPulling="2026-04-24 21:32:41.29302599 +0000 UTC m=+375.277240747" lastFinishedPulling="2026-04-24 21:32:43.756873276 +0000 UTC m=+377.741088036" observedRunningTime="2026-04-24 21:32:43.857390113 +0000 UTC m=+377.841604894" watchObservedRunningTime="2026-04-24 21:32:43.85831533 +0000 UTC m=+377.842530110" Apr 24 21:33:04.984259 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:04.984225 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"] Apr 24 21:33:04.987718 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:04.987689 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz" Apr 24 21:33:04.992289 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:04.992258 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 24 21:33:04.992425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:04.992292 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 24 21:33:04.992425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:04.992258 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-vrpjh\"" Apr 24 21:33:04.992425 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:04.992258 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 24 21:33:05.005281 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.005251 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"] Apr 24 21:33:05.108361 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:33:05.108321 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9ml8\" (UniqueName: \"kubernetes.io/projected/2a9f2315-f28a-4d4b-959e-97cdc222b75c-kube-api-access-j9ml8\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz" Apr 24 21:33:05.108553 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.108370 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a9f2315-f28a-4d4b-959e-97cdc222b75c-metrics-cert\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz" Apr 24 21:33:05.108553 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.108396 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9f2315-f28a-4d4b-959e-97cdc222b75c-cert\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz" Apr 24 21:33:05.108553 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.108482 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2a9f2315-f28a-4d4b-959e-97cdc222b75c-manager-config\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz" Apr 24 21:33:05.209796 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.209762 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9ml8\" (UniqueName: 
\"kubernetes.io/projected/2a9f2315-f28a-4d4b-959e-97cdc222b75c-kube-api-access-j9ml8\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.210009 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.209804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a9f2315-f28a-4d4b-959e-97cdc222b75c-metrics-cert\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.210009 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.209828 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9f2315-f28a-4d4b-959e-97cdc222b75c-cert\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.210009 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.209858 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2a9f2315-f28a-4d4b-959e-97cdc222b75c-manager-config\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.210432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.210403 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2a9f2315-f28a-4d4b-959e-97cdc222b75c-manager-config\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.212586 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.212563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9f2315-f28a-4d4b-959e-97cdc222b75c-cert\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.212692 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.212619 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a9f2315-f28a-4d4b-959e-97cdc222b75c-metrics-cert\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.219033 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.219011 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9ml8\" (UniqueName: \"kubernetes.io/projected/2a9f2315-f28a-4d4b-959e-97cdc222b75c-kube-api-access-j9ml8\") pod \"lws-controller-manager-798cfdf8b-9vksz\" (UID: \"2a9f2315-f28a-4d4b-959e-97cdc222b75c\") " pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.296778 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.296680 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:05.427398 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.427342 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"]
Apr 24 21:33:05.433576 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:33:05.433543 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a9f2315_f28a_4d4b_959e_97cdc222b75c.slice/crio-dc0ff0061c340bd93a4e3a91b8cf24688ae5b77a5f5b8443fb128320c8f1c3d3 WatchSource:0}: Error finding container dc0ff0061c340bd93a4e3a91b8cf24688ae5b77a5f5b8443fb128320c8f1c3d3: Status 404 returned error can't find the container with id dc0ff0061c340bd93a4e3a91b8cf24688ae5b77a5f5b8443fb128320c8f1c3d3
Apr 24 21:33:05.902834 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:05.902793 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz" event={"ID":"2a9f2315-f28a-4d4b-959e-97cdc222b75c","Type":"ContainerStarted","Data":"dc0ff0061c340bd93a4e3a91b8cf24688ae5b77a5f5b8443fb128320c8f1c3d3"}
Apr 24 21:33:07.910735 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:07.910698 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz" event={"ID":"2a9f2315-f28a-4d4b-959e-97cdc222b75c","Type":"ContainerStarted","Data":"d05bee9eeda0af342c55b45f085b00087578c80609f859650d498dedbff2b81d"}
Apr 24 21:33:07.911135 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:07.910812 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:33:07.941063 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:07.941009 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz" podStartSLOduration=2.5211458369999997 podStartE2EDuration="3.94099364s" podCreationTimestamp="2026-04-24 21:33:04 +0000 UTC" firstStartedPulling="2026-04-24 21:33:05.435540049 +0000 UTC m=+399.419754805" lastFinishedPulling="2026-04-24 21:33:06.855387847 +0000 UTC m=+400.839602608" observedRunningTime="2026-04-24 21:33:07.939482919 +0000 UTC m=+401.923697711" watchObservedRunningTime="2026-04-24 21:33:07.94099364 +0000 UTC m=+401.925208419"
Apr 24 21:33:18.915571 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:33:18.915538 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-798cfdf8b-9vksz"
Apr 24 21:34:19.152485 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.152448 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"]
Apr 24 21:34:19.156113 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.156092 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.158572 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.158544 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 24 21:34:19.158724 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.158592 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 24 21:34:19.158724 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.158558 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 24 21:34:19.159496 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.159471 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ps4w2\""
Apr 24 21:34:19.159641 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.159472 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 24 21:34:19.163928 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.163885 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"]
Apr 24 21:34:19.307713 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.307649 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jrsz\" (UniqueName: \"kubernetes.io/projected/be5848e5-3de3-438f-b807-9f00b1969cd4-kube-api-access-6jrsz\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.307891 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.307754 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/be5848e5-3de3-438f-b807-9f00b1969cd4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.307891 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.307801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/be5848e5-3de3-438f-b807-9f00b1969cd4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.408786 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.408686 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/be5848e5-3de3-438f-b807-9f00b1969cd4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.408786 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.408738 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/be5848e5-3de3-438f-b807-9f00b1969cd4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.408973 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.408833 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jrsz\" (UniqueName: \"kubernetes.io/projected/be5848e5-3de3-438f-b807-9f00b1969cd4-kube-api-access-6jrsz\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.408973 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:34:19.408928 2580 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 24 21:34:19.409041 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:34:19.409013 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be5848e5-3de3-438f-b807-9f00b1969cd4-plugin-serving-cert podName:be5848e5-3de3-438f-b807-9f00b1969cd4 nodeName:}" failed. No retries permitted until 2026-04-24 21:34:19.908991998 +0000 UTC m=+473.893206773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/be5848e5-3de3-438f-b807-9f00b1969cd4-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-7zfmp" (UID: "be5848e5-3de3-438f-b807-9f00b1969cd4") : secret "plugin-serving-cert" not found
Apr 24 21:34:19.409421 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.409402 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/be5848e5-3de3-438f-b807-9f00b1969cd4-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.425645 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.425608 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jrsz\" (UniqueName: \"kubernetes.io/projected/be5848e5-3de3-438f-b807-9f00b1969cd4-kube-api-access-6jrsz\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.913176 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.913122 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/be5848e5-3de3-438f-b807-9f00b1969cd4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:19.915908 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:19.915877 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/be5848e5-3de3-438f-b807-9f00b1969cd4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-7zfmp\" (UID: \"be5848e5-3de3-438f-b807-9f00b1969cd4\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:20.068970 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:20.068930 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"
Apr 24 21:34:20.196202 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:20.196175 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp"]
Apr 24 21:34:20.198088 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:34:20.198059 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe5848e5_3de3_438f_b807_9f00b1969cd4.slice/crio-5a3346b250d696f5292f43511e49a29d7def9eadb54930660467fc12eb0dc703 WatchSource:0}: Error finding container 5a3346b250d696f5292f43511e49a29d7def9eadb54930660467fc12eb0dc703: Status 404 returned error can't find the container with id 5a3346b250d696f5292f43511e49a29d7def9eadb54930660467fc12eb0dc703
Apr 24 21:34:21.116078 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:21.116039 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp" event={"ID":"be5848e5-3de3-438f-b807-9f00b1969cd4","Type":"ContainerStarted","Data":"5a3346b250d696f5292f43511e49a29d7def9eadb54930660467fc12eb0dc703"}
Apr 24 21:34:26.138976 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:26.138934 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp" event={"ID":"be5848e5-3de3-438f-b807-9f00b1969cd4","Type":"ContainerStarted","Data":"1ff87841ad19b12950f0af6bbe34a5a68834c3b1de9fc06cdb237185b14b3400"}
Apr 24 21:34:26.159806 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:34:26.159746 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-7zfmp" podStartSLOduration=2.207665367 podStartE2EDuration="7.159726963s" podCreationTimestamp="2026-04-24 21:34:19 +0000 UTC" firstStartedPulling="2026-04-24 21:34:20.199686814 +0000 UTC m=+474.183901588" lastFinishedPulling="2026-04-24 21:34:25.151748417 +0000 UTC m=+479.135963184" observedRunningTime="2026-04-24 21:34:26.15905211 +0000 UTC m=+480.143266889" watchObservedRunningTime="2026-04-24 21:34:26.159726963 +0000 UTC m=+480.143941743"
Apr 24 21:35:03.790018 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.789978 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-hx7qg"]
Apr 24 21:35:03.814700 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.814647 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-hx7qg"]
Apr 24 21:35:03.814884 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.814788 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-hx7qg"
Apr 24 21:35:03.817503 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.817476 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-276xr\""
Apr 24 21:35:03.859313 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.859275 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pvh8\" (UniqueName: \"kubernetes.io/projected/b91c6380-9915-4053-a21a-c77d3e204766-kube-api-access-4pvh8\") pod \"authorino-674b59b84c-hx7qg\" (UID: \"b91c6380-9915-4053-a21a-c77d3e204766\") " pod="kuadrant-system/authorino-674b59b84c-hx7qg"
Apr 24 21:35:03.917484 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.917445 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4phn2"]
Apr 24 21:35:03.920624 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.920608 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-4phn2"
Apr 24 21:35:03.927069 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.927039 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4phn2"]
Apr 24 21:35:03.959760 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.959715 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr7xv\" (UniqueName: \"kubernetes.io/projected/84d1886c-72cb-4428-abfe-2902290a84e5-kube-api-access-xr7xv\") pod \"authorino-79cbc94b89-4phn2\" (UID: \"84d1886c-72cb-4428-abfe-2902290a84e5\") " pod="kuadrant-system/authorino-79cbc94b89-4phn2"
Apr 24 21:35:03.959913 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.959799 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pvh8\" (UniqueName: \"kubernetes.io/projected/b91c6380-9915-4053-a21a-c77d3e204766-kube-api-access-4pvh8\") pod \"authorino-674b59b84c-hx7qg\" (UID: \"b91c6380-9915-4053-a21a-c77d3e204766\") " pod="kuadrant-system/authorino-674b59b84c-hx7qg"
Apr 24 21:35:03.972272 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:03.972242 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pvh8\" (UniqueName: \"kubernetes.io/projected/b91c6380-9915-4053-a21a-c77d3e204766-kube-api-access-4pvh8\") pod \"authorino-674b59b84c-hx7qg\" (UID: \"b91c6380-9915-4053-a21a-c77d3e204766\") " pod="kuadrant-system/authorino-674b59b84c-hx7qg"
Apr 24 21:35:04.061223 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:04.061125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr7xv\" (UniqueName: \"kubernetes.io/projected/84d1886c-72cb-4428-abfe-2902290a84e5-kube-api-access-xr7xv\") pod \"authorino-79cbc94b89-4phn2\" (UID: \"84d1886c-72cb-4428-abfe-2902290a84e5\") " pod="kuadrant-system/authorino-79cbc94b89-4phn2"
Apr 24 21:35:04.069558 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:04.069522 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr7xv\" (UniqueName: \"kubernetes.io/projected/84d1886c-72cb-4428-abfe-2902290a84e5-kube-api-access-xr7xv\") pod \"authorino-79cbc94b89-4phn2\" (UID: \"84d1886c-72cb-4428-abfe-2902290a84e5\") " pod="kuadrant-system/authorino-79cbc94b89-4phn2"
Apr 24 21:35:04.124039 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:04.123977 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-hx7qg"
Apr 24 21:35:04.231017 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:04.230983 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-4phn2"
Apr 24 21:35:04.254427 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:04.254399 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-hx7qg"]
Apr 24 21:35:04.257272 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:35:04.257243 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb91c6380_9915_4053_a21a_c77d3e204766.slice/crio-507b98bad94c088d4aa09d21d9a7adc09666be2876035520541ea27dc8e79aad WatchSource:0}: Error finding container 507b98bad94c088d4aa09d21d9a7adc09666be2876035520541ea27dc8e79aad: Status 404 returned error can't find the container with id 507b98bad94c088d4aa09d21d9a7adc09666be2876035520541ea27dc8e79aad
Apr 24 21:35:04.355570 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:04.355487 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4phn2"]
Apr 24 21:35:04.358255 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:35:04.358228 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d1886c_72cb_4428_abfe_2902290a84e5.slice/crio-47ae453acfb55486c081bf0c53ddb66a8f60f821ab45dfc0a873074d83e05573 WatchSource:0}: Error finding container 47ae453acfb55486c081bf0c53ddb66a8f60f821ab45dfc0a873074d83e05573: Status 404 returned error can't find the container with id 47ae453acfb55486c081bf0c53ddb66a8f60f821ab45dfc0a873074d83e05573
Apr 24 21:35:05.260456 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:05.260408 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-hx7qg" event={"ID":"b91c6380-9915-4053-a21a-c77d3e204766","Type":"ContainerStarted","Data":"507b98bad94c088d4aa09d21d9a7adc09666be2876035520541ea27dc8e79aad"}
Apr 24 21:35:05.262205 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:05.262169 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-4phn2" event={"ID":"84d1886c-72cb-4428-abfe-2902290a84e5","Type":"ContainerStarted","Data":"47ae453acfb55486c081bf0c53ddb66a8f60f821ab45dfc0a873074d83e05573"}
Apr 24 21:35:07.270598 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:07.270559 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-4phn2" event={"ID":"84d1886c-72cb-4428-abfe-2902290a84e5","Type":"ContainerStarted","Data":"1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752"}
Apr 24 21:35:07.272040 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:07.272014 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-hx7qg" event={"ID":"b91c6380-9915-4053-a21a-c77d3e204766","Type":"ContainerStarted","Data":"9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff"}
Apr 24 21:35:07.288876 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:07.288774 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-4phn2" podStartSLOduration=1.601631781 podStartE2EDuration="4.288756031s" podCreationTimestamp="2026-04-24 21:35:03 +0000 UTC" firstStartedPulling="2026-04-24 21:35:04.359614912 +0000 UTC m=+518.343829672" lastFinishedPulling="2026-04-24 21:35:07.046739159 +0000 UTC m=+521.030953922" observedRunningTime="2026-04-24 21:35:07.287295307 +0000 UTC m=+521.271510086" watchObservedRunningTime="2026-04-24 21:35:07.288756031 +0000 UTC m=+521.272970837"
Apr 24 21:35:07.309956 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:07.309890 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-hx7qg" podStartSLOduration=1.5103298 podStartE2EDuration="4.309866161s" podCreationTimestamp="2026-04-24 21:35:03 +0000 UTC" firstStartedPulling="2026-04-24 21:35:04.258962662 +0000 UTC m=+518.243177419" lastFinishedPulling="2026-04-24 21:35:07.058499014 +0000 UTC m=+521.042713780" observedRunningTime="2026-04-24 21:35:07.309455578 +0000 UTC m=+521.293670358" watchObservedRunningTime="2026-04-24 21:35:07.309866161 +0000 UTC m=+521.294080944"
Apr 24 21:35:07.321748 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:07.321707 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-hx7qg"]
Apr 24 21:35:09.278309 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:09.278268 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-hx7qg" podUID="b91c6380-9915-4053-a21a-c77d3e204766" containerName="authorino" containerID="cri-o://9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff" gracePeriod=30
Apr 24 21:35:09.517498 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:09.517471 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-hx7qg"
Apr 24 21:35:09.604830 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:09.604732 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pvh8\" (UniqueName: \"kubernetes.io/projected/b91c6380-9915-4053-a21a-c77d3e204766-kube-api-access-4pvh8\") pod \"b91c6380-9915-4053-a21a-c77d3e204766\" (UID: \"b91c6380-9915-4053-a21a-c77d3e204766\") "
Apr 24 21:35:09.607127 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:09.607090 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b91c6380-9915-4053-a21a-c77d3e204766-kube-api-access-4pvh8" (OuterVolumeSpecName: "kube-api-access-4pvh8") pod "b91c6380-9915-4053-a21a-c77d3e204766" (UID: "b91c6380-9915-4053-a21a-c77d3e204766"). InnerVolumeSpecName "kube-api-access-4pvh8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:35:09.705897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:09.705859 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4pvh8\" (UniqueName: \"kubernetes.io/projected/b91c6380-9915-4053-a21a-c77d3e204766-kube-api-access-4pvh8\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:35:10.282174 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.282142 2580 generic.go:358] "Generic (PLEG): container finished" podID="b91c6380-9915-4053-a21a-c77d3e204766" containerID="9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff" exitCode=0
Apr 24 21:35:10.282622 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.282190 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-hx7qg"
Apr 24 21:35:10.282622 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.282228 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-hx7qg" event={"ID":"b91c6380-9915-4053-a21a-c77d3e204766","Type":"ContainerDied","Data":"9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff"}
Apr 24 21:35:10.282622 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.282267 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-hx7qg" event={"ID":"b91c6380-9915-4053-a21a-c77d3e204766","Type":"ContainerDied","Data":"507b98bad94c088d4aa09d21d9a7adc09666be2876035520541ea27dc8e79aad"}
Apr 24 21:35:10.282622 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.282285 2580 scope.go:117] "RemoveContainer" containerID="9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff"
Apr 24 21:35:10.295394 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.295367 2580 scope.go:117] "RemoveContainer" containerID="9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff"
Apr 24 21:35:10.295740 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:35:10.295716 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff\": container with ID starting with 9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff not found: ID does not exist" containerID="9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff"
Apr 24 21:35:10.295828 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.295763 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff"} err="failed to get container status \"9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff\": rpc error: code = NotFound desc = could not find container \"9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff\": container with ID starting with 9a68c9f00c69f745e23b44d54e491197a05f83635d5e2d507e114e9e468ab3ff not found: ID does not exist"
Apr 24 21:35:10.310019 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.309985 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-hx7qg"]
Apr 24 21:35:10.311540 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.311516 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-hx7qg"]
Apr 24 21:35:10.578218 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:10.578134 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b91c6380-9915-4053-a21a-c77d3e204766" path="/var/lib/kubelet/pods/b91c6380-9915-4053-a21a-c77d3e204766/volumes"
Apr 24 21:35:28.206496 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.206457 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-rdwcv"]
Apr 24 21:35:28.206885 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.206761 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b91c6380-9915-4053-a21a-c77d3e204766" containerName="authorino"
Apr 24 21:35:28.206885 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.206772 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b91c6380-9915-4053-a21a-c77d3e204766" containerName="authorino"
Apr 24 21:35:28.206885 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.206834 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b91c6380-9915-4053-a21a-c77d3e204766" containerName="authorino"
Apr 24 21:35:28.210625 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.210607 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-rdwcv"
Apr 24 21:35:28.212866 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.212838 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 24 21:35:28.217189 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.217162 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-rdwcv"]
Apr 24 21:35:28.362060 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.362026 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e680c157-9498-4ee6-ae79-941f7eb8f874-tls-cert\") pod \"authorino-68bd676465-rdwcv\" (UID: \"e680c157-9498-4ee6-ae79-941f7eb8f874\") " pod="kuadrant-system/authorino-68bd676465-rdwcv"
Apr 24 21:35:28.362257 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.362115 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pnfd\" (UniqueName: \"kubernetes.io/projected/e680c157-9498-4ee6-ae79-941f7eb8f874-kube-api-access-2pnfd\") pod \"authorino-68bd676465-rdwcv\" (UID: \"e680c157-9498-4ee6-ae79-941f7eb8f874\") " pod="kuadrant-system/authorino-68bd676465-rdwcv"
Apr 24 21:35:28.463351 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.463259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pnfd\" (UniqueName: \"kubernetes.io/projected/e680c157-9498-4ee6-ae79-941f7eb8f874-kube-api-access-2pnfd\") pod \"authorino-68bd676465-rdwcv\" (UID: \"e680c157-9498-4ee6-ae79-941f7eb8f874\") " pod="kuadrant-system/authorino-68bd676465-rdwcv"
Apr 24 21:35:28.463351 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.463324 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e680c157-9498-4ee6-ae79-941f7eb8f874-tls-cert\") pod \"authorino-68bd676465-rdwcv\" (UID: \"e680c157-9498-4ee6-ae79-941f7eb8f874\") " pod="kuadrant-system/authorino-68bd676465-rdwcv"
Apr 24 21:35:28.465906 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.465876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e680c157-9498-4ee6-ae79-941f7eb8f874-tls-cert\") pod \"authorino-68bd676465-rdwcv\" (UID: \"e680c157-9498-4ee6-ae79-941f7eb8f874\") " pod="kuadrant-system/authorino-68bd676465-rdwcv"
Apr 24 21:35:28.471226 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.471200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pnfd\" (UniqueName: \"kubernetes.io/projected/e680c157-9498-4ee6-ae79-941f7eb8f874-kube-api-access-2pnfd\") pod \"authorino-68bd676465-rdwcv\" (UID: \"e680c157-9498-4ee6-ae79-941f7eb8f874\") " pod="kuadrant-system/authorino-68bd676465-rdwcv"
Apr 24 21:35:28.520249 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.520211 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-rdwcv"
Apr 24 21:35:28.650448 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:28.650310 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-rdwcv"]
Apr 24 21:35:28.655996 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:35:28.654475 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode680c157_9498_4ee6_ae79_941f7eb8f874.slice/crio-a1ba9d43416e3fa0ce6acaec83ceff6183c798ac52942a2e27e169d2d422a75e WatchSource:0}: Error finding container a1ba9d43416e3fa0ce6acaec83ceff6183c798ac52942a2e27e169d2d422a75e: Status 404 returned error can't find the container with id a1ba9d43416e3fa0ce6acaec83ceff6183c798ac52942a2e27e169d2d422a75e
Apr 24 21:35:29.339995 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.339885 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-rdwcv" event={"ID":"e680c157-9498-4ee6-ae79-941f7eb8f874","Type":"ContainerStarted","Data":"eafb6e2bffed004859696aefb73f577e2c3edb0eb41c6635c0b8a63a56a391fb"}
Apr 24 21:35:29.339995 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.339934 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-rdwcv" event={"ID":"e680c157-9498-4ee6-ae79-941f7eb8f874","Type":"ContainerStarted","Data":"a1ba9d43416e3fa0ce6acaec83ceff6183c798ac52942a2e27e169d2d422a75e"}
Apr 24 21:35:29.357964 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.357906 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-rdwcv" podStartSLOduration=0.963762453 podStartE2EDuration="1.357862739s" podCreationTimestamp="2026-04-24 21:35:28 +0000 UTC" firstStartedPulling="2026-04-24 21:35:28.656354121 +0000 UTC m=+542.640568878" lastFinishedPulling="2026-04-24 21:35:29.050454404 +0000 UTC m=+543.034669164" observedRunningTime="2026-04-24 21:35:29.356324373 +0000 UTC m=+543.340539152" watchObservedRunningTime="2026-04-24 21:35:29.357862739 +0000 UTC m=+543.342077586"
Apr 24 21:35:29.378189 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.378152 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4phn2"]
Apr 24 21:35:29.378435 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.378409 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-4phn2" podUID="84d1886c-72cb-4428-abfe-2902290a84e5" containerName="authorino" containerID="cri-o://1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752" gracePeriod=30
Apr 24 21:35:29.627106 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.627077 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-4phn2"
Apr 24 21:35:29.773219 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.773183 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr7xv\" (UniqueName: \"kubernetes.io/projected/84d1886c-72cb-4428-abfe-2902290a84e5-kube-api-access-xr7xv\") pod \"84d1886c-72cb-4428-abfe-2902290a84e5\" (UID: \"84d1886c-72cb-4428-abfe-2902290a84e5\") "
Apr 24 21:35:29.775484 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.775450 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d1886c-72cb-4428-abfe-2902290a84e5-kube-api-access-xr7xv" (OuterVolumeSpecName: "kube-api-access-xr7xv") pod "84d1886c-72cb-4428-abfe-2902290a84e5" (UID: "84d1886c-72cb-4428-abfe-2902290a84e5"). InnerVolumeSpecName "kube-api-access-xr7xv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:35:29.874599 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:29.874512 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xr7xv\" (UniqueName: \"kubernetes.io/projected/84d1886c-72cb-4428-abfe-2902290a84e5-kube-api-access-xr7xv\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:35:30.343787 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.343748 2580 generic.go:358] "Generic (PLEG): container finished" podID="84d1886c-72cb-4428-abfe-2902290a84e5" containerID="1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752" exitCode=0
Apr 24 21:35:30.344271 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.343804 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-4phn2"
Apr 24 21:35:30.344271 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.343832 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-4phn2" event={"ID":"84d1886c-72cb-4428-abfe-2902290a84e5","Type":"ContainerDied","Data":"1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752"}
Apr 24 21:35:30.344271 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.343867 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-4phn2" event={"ID":"84d1886c-72cb-4428-abfe-2902290a84e5","Type":"ContainerDied","Data":"47ae453acfb55486c081bf0c53ddb66a8f60f821ab45dfc0a873074d83e05573"}
Apr 24 21:35:30.344271 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.343882 2580 scope.go:117] "RemoveContainer" containerID="1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752"
Apr 24 21:35:30.352396 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.352377 2580 scope.go:117] "RemoveContainer" containerID="1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752"
Apr 24 21:35:30.352717 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:35:30.352699 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752\": container with ID starting with 1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752 not found: ID does not exist" containerID="1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752"
Apr 24 21:35:30.352769 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.352728 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752"} err="failed to get container status \"1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752\": rpc error: code = NotFound desc = could not find container \"1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752\": container with ID starting with 1085e040e373d8cf77509db8b4c3ee501111ec86204428822917973872a3b752 not found: ID does not exist"
Apr 24 21:35:30.365865 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.365831 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4phn2"]
Apr 24 21:35:30.369647 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.369618 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-4phn2"]
Apr 24 21:35:30.578154 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:35:30.578113 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d1886c-72cb-4428-abfe-2902290a84e5" path="/var/lib/kubelet/pods/84d1886c-72cb-4428-abfe-2902290a84e5/volumes"
Apr 24 21:36:26.487533 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:36:26.487502 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log"
Apr 24 21:36:26.488091 ip-10-0-131-237
kubenswrapper[2580]: I0424 21:36:26.487719 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:37:49.391461 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.391417 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p"] Apr 24 21:37:49.391890 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.391755 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84d1886c-72cb-4428-abfe-2902290a84e5" containerName="authorino" Apr 24 21:37:49.391890 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.391769 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d1886c-72cb-4428-abfe-2902290a84e5" containerName="authorino" Apr 24 21:37:49.391890 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.391827 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="84d1886c-72cb-4428-abfe-2902290a84e5" containerName="authorino" Apr 24 21:37:49.394524 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.394509 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.397045 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.397023 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:37:49.397177 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.397024 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:37:49.397845 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.397829 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wjq2p\"" Apr 24 21:37:49.397918 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.397896 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 24 21:37:49.406529 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.406504 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p"] Apr 24 21:37:49.460830 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.460790 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzpl5\" (UniqueName: \"kubernetes.io/projected/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kube-api-access-fzpl5\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.460999 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.460843 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-dshm\") pod 
\"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.460999 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.460908 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-home\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.460999 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.460946 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.460999 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.460964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-model-cache\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.460999 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.460982 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tmp-dir\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: 
\"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.461158 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.461059 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tls-certs\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.561620 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.561581 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tls-certs\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.561824 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.561630 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzpl5\" (UniqueName: \"kubernetes.io/projected/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kube-api-access-fzpl5\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.561824 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.561693 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-dshm\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.561824 
ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.561748 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-home\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.561824 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.561799 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.562010 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.561825 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-model-cache\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.562010 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.561882 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tmp-dir\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.562241 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.562222 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.562319 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.562249 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-home\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.562378 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.562312 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-model-cache\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.562378 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.562343 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tmp-dir\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.564195 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.564172 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-dshm\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: 
\"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.564406 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.564388 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tls-certs\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.573917 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.573886 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzpl5\" (UniqueName: \"kubernetes.io/projected/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kube-api-access-fzpl5\") pod \"scheduler-inline-config-test-kserve-6f547d47b5-7c75p\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.704524 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.704491 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:37:49.867527 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.867494 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p"] Apr 24 21:37:49.871101 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:37:49.871066 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d32ca98_6b4b_442a_99ef_9c7fbda96444.slice/crio-64b55e7043b72f3c0676be53c2e353fd36486757824a1a4e26326ef68343b663 WatchSource:0}: Error finding container 64b55e7043b72f3c0676be53c2e353fd36486757824a1a4e26326ef68343b663: Status 404 returned error can't find the container with id 64b55e7043b72f3c0676be53c2e353fd36486757824a1a4e26326ef68343b663 Apr 24 21:37:49.873035 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:49.873016 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:37:50.765380 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:50.765336 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" event={"ID":"7d32ca98-6b4b-442a-99ef-9c7fbda96444","Type":"ContainerStarted","Data":"64b55e7043b72f3c0676be53c2e353fd36486757824a1a4e26326ef68343b663"} Apr 24 21:37:53.777432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:37:53.777364 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" event={"ID":"7d32ca98-6b4b-442a-99ef-9c7fbda96444","Type":"ContainerStarted","Data":"8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5"} Apr 24 21:38:03.781702 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.781645 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7"] Apr 24 21:38:03.790553 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.790521 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.793154 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.793124 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec2774c263d49959f50d9eebc552e13bf9-kserve-self-signed-certs\"" Apr 24 21:38:03.797534 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.797505 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7"] Apr 24 21:38:03.893449 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.893405 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.893449 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.893448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.893727 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.893479 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.893727 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.893532 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.893727 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.893562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4dl9\" (UniqueName: \"kubernetes.io/projected/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kube-api-access-w4dl9\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.893727 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.893626 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.893727 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.893711 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.994827 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.994790 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.994827 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.994832 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995086 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.994854 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4dl9\" (UniqueName: \"kubernetes.io/projected/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kube-api-access-w4dl9\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995086 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.994898 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995086 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.994962 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995086 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.995006 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995086 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.995030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995334 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.995277 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995334 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.995296 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995401 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.995361 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.995435 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.995405 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.997546 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.997523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-dshm\") pod 
\"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:03.997808 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:03.997791 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:04.002822 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:04.002788 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4dl9\" (UniqueName: \"kubernetes.io/projected/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kube-api-access-w4dl9\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:04.103247 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:04.103150 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:04.236161 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:04.236084 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7"] Apr 24 21:38:04.813354 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:04.813316 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" event={"ID":"4225fdf8-de3f-4431-a3a5-4fd1f360a659","Type":"ContainerStarted","Data":"51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20"} Apr 24 21:38:04.813354 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:04.813354 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" event={"ID":"4225fdf8-de3f-4431-a3a5-4fd1f360a659","Type":"ContainerStarted","Data":"e990dcccaab6d972c4e2cccc85b32a8af59437b5d579a746a1cab8070f51a862"} Apr 24 21:38:21.000134 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:21.000053 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7"] Apr 24 21:38:21.000621 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:21.000335 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" podUID="4225fdf8-de3f-4431-a3a5-4fd1f360a659" containerName="storage-initializer" containerID="cri-o://51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20" gracePeriod=30 Apr 24 21:38:51.163884 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.163855 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7_4225fdf8-de3f-4431-a3a5-4fd1f360a659/storage-initializer/0.log" Apr 24 21:38:51.164216 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.163925 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:51.326028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.325935 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-model-cache\") pod \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " Apr 24 21:38:51.326028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.325998 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-home\") pod \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " Apr 24 21:38:51.326028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326020 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kserve-provision-location\") pod \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " Apr 24 21:38:51.326276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326061 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-dshm\") pod \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " Apr 24 21:38:51.326276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326097 2580 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4dl9\" (UniqueName: \"kubernetes.io/projected/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kube-api-access-w4dl9\") pod \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " Apr 24 21:38:51.326276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326140 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tls-certs\") pod \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " Apr 24 21:38:51.326276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326177 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tmp-dir\") pod \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\" (UID: \"4225fdf8-de3f-4431-a3a5-4fd1f360a659\") " Apr 24 21:38:51.326276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326225 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-model-cache" (OuterVolumeSpecName: "model-cache") pod "4225fdf8-de3f-4431-a3a5-4fd1f360a659" (UID: "4225fdf8-de3f-4431-a3a5-4fd1f360a659"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:51.326520 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326324 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-home" (OuterVolumeSpecName: "home") pod "4225fdf8-de3f-4431-a3a5-4fd1f360a659" (UID: "4225fdf8-de3f-4431-a3a5-4fd1f360a659"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:51.326520 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326468 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "4225fdf8-de3f-4431-a3a5-4fd1f360a659" (UID: "4225fdf8-de3f-4431-a3a5-4fd1f360a659"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:51.326520 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326483 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:38:51.326520 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.326504 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:38:51.328508 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.328483 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4225fdf8-de3f-4431-a3a5-4fd1f360a659" (UID: "4225fdf8-de3f-4431-a3a5-4fd1f360a659"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:38:51.328618 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.328549 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-dshm" (OuterVolumeSpecName: "dshm") pod "4225fdf8-de3f-4431-a3a5-4fd1f360a659" (UID: "4225fdf8-de3f-4431-a3a5-4fd1f360a659"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:51.328924 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.328903 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kube-api-access-w4dl9" (OuterVolumeSpecName: "kube-api-access-w4dl9") pod "4225fdf8-de3f-4431-a3a5-4fd1f360a659" (UID: "4225fdf8-de3f-4431-a3a5-4fd1f360a659"). InnerVolumeSpecName "kube-api-access-w4dl9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:38:51.362641 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.362588 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4225fdf8-de3f-4431-a3a5-4fd1f360a659" (UID: "4225fdf8-de3f-4431-a3a5-4fd1f360a659"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:38:51.427069 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.427024 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w4dl9\" (UniqueName: \"kubernetes.io/projected/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kube-api-access-w4dl9\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:38:51.427069 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.427063 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:38:51.427069 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.427077 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:38:51.427305 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:38:51.427089 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:38:51.427305 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.427100 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4225fdf8-de3f-4431-a3a5-4fd1f360a659-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:38:51.980257 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.980226 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7_4225fdf8-de3f-4431-a3a5-4fd1f360a659/storage-initializer/0.log" Apr 24 21:38:51.980502 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.980268 2580 generic.go:358] "Generic (PLEG): container finished" podID="4225fdf8-de3f-4431-a3a5-4fd1f360a659" containerID="51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20" exitCode=137 Apr 24 21:38:51.980502 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.980335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" event={"ID":"4225fdf8-de3f-4431-a3a5-4fd1f360a659","Type":"ContainerDied","Data":"51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20"} Apr 24 21:38:51.980502 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.980348 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" Apr 24 21:38:51.980502 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.980365 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7" event={"ID":"4225fdf8-de3f-4431-a3a5-4fd1f360a659","Type":"ContainerDied","Data":"e990dcccaab6d972c4e2cccc85b32a8af59437b5d579a746a1cab8070f51a862"} Apr 24 21:38:51.980502 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.980380 2580 scope.go:117] "RemoveContainer" containerID="51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20" Apr 24 21:38:51.999558 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.999536 2580 scope.go:117] "RemoveContainer" containerID="51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20" Apr 24 21:38:51.999887 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:38:51.999863 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20\": container with ID starting with 51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20 not found: ID does not exist" containerID="51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20" Apr 24 21:38:51.999944 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:51.999898 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20"} err="failed to get container status \"51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20\": rpc error: code = NotFound desc = could not find container \"51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20\": container with ID starting with 51393e75392b1fa4101b207e919aea9ca0eb56b911c3f58f288da7acd9b00a20 not found: ID does not exist" Apr 24 
21:38:52.017100 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:52.017063 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7"] Apr 24 21:38:52.019984 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:52.019946 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-69684b5889jrhk7"] Apr 24 21:38:52.578382 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:38:52.578349 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4225fdf8-de3f-4431-a3a5-4fd1f360a659" path="/var/lib/kubelet/pods/4225fdf8-de3f-4431-a3a5-4fd1f360a659/volumes" Apr 24 21:40:21.113113 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.113065 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w"] Apr 24 21:40:21.113557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.113449 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4225fdf8-de3f-4431-a3a5-4fd1f360a659" containerName="storage-initializer" Apr 24 21:40:21.113557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.113462 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4225fdf8-de3f-4431-a3a5-4fd1f360a659" containerName="storage-initializer" Apr 24 21:40:21.113557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.113534 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4225fdf8-de3f-4431-a3a5-4fd1f360a659" containerName="storage-initializer" Apr 24 21:40:21.116631 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.116608 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.122204 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.122171 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 24 21:40:21.128787 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.128755 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w"] Apr 24 21:40:21.169723 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.169680 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.169893 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.169735 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.169893 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.169777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1640545b-f73d-468e-9905-74f596109ab7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.169893 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.169812 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.169893 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.169844 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.170019 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.169878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7szdq\" (UniqueName: \"kubernetes.io/projected/1640545b-f73d-468e-9905-74f596109ab7-kube-api-access-7szdq\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.170019 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.169936 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: 
\"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.270911 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.270874 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1640545b-f73d-468e-9905-74f596109ab7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271098 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.270917 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271098 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.270952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271098 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.270980 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7szdq\" (UniqueName: \"kubernetes.io/projected/1640545b-f73d-468e-9905-74f596109ab7-kube-api-access-7szdq\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271098 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.271024 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271098 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.271062 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271098 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.271087 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271472 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.271450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271556 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.271495 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271556 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.271530 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.271639 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.271596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.273440 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.273424 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.273708 
ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.273693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1640545b-f73d-468e-9905-74f596109ab7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.279853 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.279830 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szdq\" (UniqueName: \"kubernetes.io/projected/1640545b-f73d-468e-9905-74f596109ab7-kube-api-access-7szdq\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.427896 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.427860 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:40:21.562562 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:21.562510 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w"] Apr 24 21:40:21.564076 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:40:21.564040 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1640545b_f73d_468e_9905_74f596109ab7.slice/crio-bffc5a0936b261a1fd5b8f2ad10b9bd2abc372a9beb6a773d49f88b30c53ef11 WatchSource:0}: Error finding container bffc5a0936b261a1fd5b8f2ad10b9bd2abc372a9beb6a773d49f88b30c53ef11: Status 404 returned error can't find the container with id bffc5a0936b261a1fd5b8f2ad10b9bd2abc372a9beb6a773d49f88b30c53ef11 Apr 24 21:40:22.267446 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:22.267405 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" event={"ID":"1640545b-f73d-468e-9905-74f596109ab7","Type":"ContainerStarted","Data":"b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8"} Apr 24 21:40:22.267446 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:22.267447 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" event={"ID":"1640545b-f73d-468e-9905-74f596109ab7","Type":"ContainerStarted","Data":"bffc5a0936b261a1fd5b8f2ad10b9bd2abc372a9beb6a773d49f88b30c53ef11"} Apr 24 21:40:26.283091 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:26.283055 2580 generic.go:358] "Generic (PLEG): container finished" podID="1640545b-f73d-468e-9905-74f596109ab7" containerID="b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8" exitCode=0 Apr 24 21:40:26.283471 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:40:26.283131 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" event={"ID":"1640545b-f73d-468e-9905-74f596109ab7","Type":"ContainerDied","Data":"b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8"} Apr 24 21:40:54.399949 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:54.399908 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" event={"ID":"1640545b-f73d-468e-9905-74f596109ab7","Type":"ContainerStarted","Data":"13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a"} Apr 24 21:40:54.432919 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:40:54.432848 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podStartSLOduration=5.838119378 podStartE2EDuration="33.432830369s" podCreationTimestamp="2026-04-24 21:40:21 +0000 UTC" firstStartedPulling="2026-04-24 21:40:26.284279944 +0000 UTC m=+840.268494701" lastFinishedPulling="2026-04-24 21:40:53.878990917 +0000 UTC m=+867.863205692" observedRunningTime="2026-04-24 21:40:54.42961836 +0000 UTC m=+868.413833139" watchObservedRunningTime="2026-04-24 21:40:54.432830369 +0000 UTC m=+868.417045148" Apr 24 21:41:01.428523 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:01.428492 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:41:01.428523 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:01.428532 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:41:01.430193 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:01.430155 2580 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:41:11.429253 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:11.429151 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:41:21.428639 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:21.428595 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:41:26.510412 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:26.510382 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:41:26.510948 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:26.510890 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:41:31.428541 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:31.428483 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" 
output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:41:41.429131 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:41.429085 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:41:51.428542 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:41:51.428496 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:42:01.428338 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:01.428295 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:42:11.428504 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:11.428447 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:42:21.428954 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:21.428900 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.30:8000/health\": dial tcp 10.132.0.30:8000: connect: connection refused" Apr 24 21:42:31.439054 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:31.439014 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:42:31.447319 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:31.447279 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:42:37.515789 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:37.515747 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w"] Apr 24 21:42:37.516329 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:37.516132 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main" containerID="cri-o://13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a" gracePeriod=30 Apr 24 21:42:45.501718 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.501624 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"] Apr 24 21:42:45.506559 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.506536 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.509021 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.508983 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 24 21:42:45.522463 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.522431 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"] Apr 24 21:42:45.563290 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.563242 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-home\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.563290 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.563296 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-model-cache\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.563504 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.563313 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tls-certs\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.563504 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:42:45.563370 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tmp-dir\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.563504 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.563398 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-dshm\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.563504 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.563424 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhk7c\" (UniqueName: \"kubernetes.io/projected/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kube-api-access-bhk7c\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.563504 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.563449 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.664689 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.664631 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-home\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.664689 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.664696 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-model-cache\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.665013 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.664716 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tls-certs\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.665013 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.664760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tmp-dir\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.665013 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.664796 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-dshm\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: 
\"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.665013 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.664831 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhk7c\" (UniqueName: \"kubernetes.io/projected/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kube-api-access-bhk7c\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.665013 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.664881 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.665272 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.665233 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.665327 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.665272 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-model-cache\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 
21:42:45.665327 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.665317 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tmp-dir\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.665465 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.665445 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-home\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.667436 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.667404 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-dshm\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.667844 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.667820 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tls-certs\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.673646 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.673618 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhk7c\" (UniqueName: 
\"kubernetes.io/projected/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kube-api-access-bhk7c\") pod \"custom-route-timeout-test-kserve-74fbd5dd49-f5v65\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.817312 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.817218 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:45.953339 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:45.953278 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"] Apr 24 21:42:45.956444 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:42:45.956401 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0fb8999_756b_4d2e_9dec_6a7c12345abc.slice/crio-4d6619942dace8e0e79cdc7809a28e586537860fc3f61e909e50f016ef94baa9 WatchSource:0}: Error finding container 4d6619942dace8e0e79cdc7809a28e586537860fc3f61e909e50f016ef94baa9: Status 404 returned error can't find the container with id 4d6619942dace8e0e79cdc7809a28e586537860fc3f61e909e50f016ef94baa9 Apr 24 21:42:46.781721 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:46.781679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" event={"ID":"b0fb8999-756b-4d2e-9dec-6a7c12345abc","Type":"ContainerStarted","Data":"ba601907f3e2a5b0bfd3175297b935fc43caef7d40fcd789a862565408e454da"} Apr 24 21:42:46.781721 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:46.781717 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" 
event={"ID":"b0fb8999-756b-4d2e-9dec-6a7c12345abc","Type":"ContainerStarted","Data":"4d6619942dace8e0e79cdc7809a28e586537860fc3f61e909e50f016ef94baa9"} Apr 24 21:42:50.797288 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:50.797249 2580 generic.go:358] "Generic (PLEG): container finished" podID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerID="ba601907f3e2a5b0bfd3175297b935fc43caef7d40fcd789a862565408e454da" exitCode=0 Apr 24 21:42:50.797682 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:50.797324 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" event={"ID":"b0fb8999-756b-4d2e-9dec-6a7c12345abc","Type":"ContainerDied","Data":"ba601907f3e2a5b0bfd3175297b935fc43caef7d40fcd789a862565408e454da"} Apr 24 21:42:50.798497 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:50.798441 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:42:51.805089 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:51.805053 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" event={"ID":"b0fb8999-756b-4d2e-9dec-6a7c12345abc","Type":"ContainerStarted","Data":"540a55bb9769326978b3d0f590e75918c0d457e1e9b597a5daa4458739eb65b0"} Apr 24 21:42:51.833248 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:51.833180 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podStartSLOduration=6.833161397 podStartE2EDuration="6.833161397s" podCreationTimestamp="2026-04-24 21:42:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:42:51.831861916 +0000 UTC m=+985.816076725" watchObservedRunningTime="2026-04-24 21:42:51.833161397 +0000 UTC m=+985.817376176" Apr 24 21:42:55.817565 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:42:55.817531 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:55.817565 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:55.817573 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" Apr 24 21:42:55.819273 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:42:55.819237 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:43:05.818452 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:05.818398 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:43:07.788787 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.788757 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w_1640545b-f73d-468e-9905-74f596109ab7/main/0.log" Apr 24 21:43:07.789186 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.789169 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:43:07.858911 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.858880 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w_1640545b-f73d-468e-9905-74f596109ab7/main/0.log" Apr 24 21:43:07.859280 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.859245 2580 generic.go:358] "Generic (PLEG): container finished" podID="1640545b-f73d-468e-9905-74f596109ab7" containerID="13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a" exitCode=137 Apr 24 21:43:07.859419 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.859284 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" event={"ID":"1640545b-f73d-468e-9905-74f596109ab7","Type":"ContainerDied","Data":"13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a"} Apr 24 21:43:07.859419 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.859325 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" Apr 24 21:43:07.859419 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.859335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w" event={"ID":"1640545b-f73d-468e-9905-74f596109ab7","Type":"ContainerDied","Data":"bffc5a0936b261a1fd5b8f2ad10b9bd2abc372a9beb6a773d49f88b30c53ef11"} Apr 24 21:43:07.859419 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.859360 2580 scope.go:117] "RemoveContainer" containerID="13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a" Apr 24 21:43:07.867519 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867490 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1640545b-f73d-468e-9905-74f596109ab7-tls-certs\") pod \"1640545b-f73d-468e-9905-74f596109ab7\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " Apr 24 21:43:07.867791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867554 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-kserve-provision-location\") pod \"1640545b-f73d-468e-9905-74f596109ab7\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " Apr 24 21:43:07.867791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867585 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-model-cache\") pod \"1640545b-f73d-468e-9905-74f596109ab7\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " Apr 24 21:43:07.867791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867619 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-home\") pod \"1640545b-f73d-468e-9905-74f596109ab7\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " Apr 24 21:43:07.867791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867687 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-tmp-dir\") pod \"1640545b-f73d-468e-9905-74f596109ab7\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " Apr 24 21:43:07.867791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867732 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-dshm\") pod \"1640545b-f73d-468e-9905-74f596109ab7\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " Apr 24 21:43:07.867791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867756 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7szdq\" (UniqueName: \"kubernetes.io/projected/1640545b-f73d-468e-9905-74f596109ab7-kube-api-access-7szdq\") pod \"1640545b-f73d-468e-9905-74f596109ab7\" (UID: \"1640545b-f73d-468e-9905-74f596109ab7\") " Apr 24 21:43:07.868091 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867946 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-model-cache" (OuterVolumeSpecName: "model-cache") pod "1640545b-f73d-468e-9905-74f596109ab7" (UID: "1640545b-f73d-468e-9905-74f596109ab7"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:07.868091 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.867731 2580 scope.go:117] "RemoveContainer" containerID="b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8" Apr 24 21:43:07.868091 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.868044 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:43:07.868708 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.868675 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-home" (OuterVolumeSpecName: "home") pod "1640545b-f73d-468e-9905-74f596109ab7" (UID: "1640545b-f73d-468e-9905-74f596109ab7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:07.870647 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.870619 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1640545b-f73d-468e-9905-74f596109ab7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1640545b-f73d-468e-9905-74f596109ab7" (UID: "1640545b-f73d-468e-9905-74f596109ab7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:43:07.870647 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.870628 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1640545b-f73d-468e-9905-74f596109ab7-kube-api-access-7szdq" (OuterVolumeSpecName: "kube-api-access-7szdq") pod "1640545b-f73d-468e-9905-74f596109ab7" (UID: "1640545b-f73d-468e-9905-74f596109ab7"). InnerVolumeSpecName "kube-api-access-7szdq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:43:07.871155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.871135 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-dshm" (OuterVolumeSpecName: "dshm") pod "1640545b-f73d-468e-9905-74f596109ab7" (UID: "1640545b-f73d-468e-9905-74f596109ab7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:07.888895 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.888831 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "1640545b-f73d-468e-9905-74f596109ab7" (UID: "1640545b-f73d-468e-9905-74f596109ab7"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:07.938545 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.938493 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1640545b-f73d-468e-9905-74f596109ab7" (UID: "1640545b-f73d-468e-9905-74f596109ab7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:07.953381 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.953355 2580 scope.go:117] "RemoveContainer" containerID="13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a" Apr 24 21:43:07.953714 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:43:07.953693 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a\": container with ID starting with 13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a not found: ID does not exist" containerID="13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a" Apr 24 21:43:07.953775 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.953725 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a"} err="failed to get container status \"13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a\": rpc error: code = NotFound desc = could not find container \"13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a\": container with ID starting with 13135bdc0178052f7f3318028e22b3f955562d908ae90c4c824c80cd9d171d9a not found: ID does not exist" Apr 24 21:43:07.953775 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.953744 2580 scope.go:117] "RemoveContainer" containerID="b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8" Apr 24 21:43:07.954005 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:43:07.953991 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8\": container with ID starting with b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8 not found: ID does not exist" 
containerID="b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8" Apr 24 21:43:07.954062 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.954009 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8"} err="failed to get container status \"b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8\": rpc error: code = NotFound desc = could not find container \"b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8\": container with ID starting with b519330b109d05238fb435894a68a7b6c132ae183d641713763d472b197081d8 not found: ID does not exist" Apr 24 21:43:07.968470 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.968444 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:43:07.968470 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.968470 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7szdq\" (UniqueName: \"kubernetes.io/projected/1640545b-f73d-468e-9905-74f596109ab7-kube-api-access-7szdq\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:43:07.968645 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.968485 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1640545b-f73d-468e-9905-74f596109ab7-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:43:07.968645 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.968496 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:43:07.968645 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:43:07.968505 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:43:07.968645 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:07.968513 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1640545b-f73d-468e-9905-74f596109ab7-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:43:08.181998 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:08.181955 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w"] Apr 24 21:43:08.188275 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:08.188231 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-7f5bc589c7bws5w"] Apr 24 21:43:08.578548 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:08.578465 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1640545b-f73d-468e-9905-74f596109ab7" path="/var/lib/kubelet/pods/1640545b-f73d-468e-9905-74f596109ab7/volumes" Apr 24 21:43:15.818096 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:15.818024 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:43:25.818184 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:25.818137 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:43:35.818068 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:35.818012 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:43:45.818730 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:45.818644 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:43:55.818426 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:43:55.818359 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:44:05.818252 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:05.818202 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:44:15.818595 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:15.818536 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused"
Apr 24 21:44:25.818030 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:25.817980 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused"
Apr 24 21:44:35.828238 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:35.828205 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"
Apr 24 21:44:35.835994 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:35.835968 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"
Apr 24 21:44:42.053300 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:42.053260 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"]
Apr 24 21:44:42.053817 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:42.053571 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main" containerID="cri-o://540a55bb9769326978b3d0f590e75918c0d457e1e9b597a5daa4458739eb65b0" gracePeriod=30
Apr 24 21:44:50.687492 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.687451 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"]
Apr 24 21:44:50.687966 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.687949 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="storage-initializer"
Apr 24 21:44:50.688028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.687970 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="storage-initializer"
Apr 24 21:44:50.688028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.687985 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main"
Apr 24 21:44:50.688028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.687993 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main"
Apr 24 21:44:50.688123 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.688078 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="1640545b-f73d-468e-9905-74f596109ab7" containerName="main"
Apr 24 21:44:50.691220 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.691198 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.693743 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.693709 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 24 21:44:50.700831 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.700801 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"]
Apr 24 21:44:50.770240 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.770202 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5rhq\" (UniqueName: \"kubernetes.io/projected/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kube-api-access-z5rhq\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.770434 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.770257 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kserve-provision-location\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.770434 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.770303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-dshm\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.770434 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.770322 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-home\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.770434 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.770369 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tmp-dir\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.770608 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.770479 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-model-cache\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.770608 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.770504 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tls-certs\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871107 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kserve-provision-location\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-dshm\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871147 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-home\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tmp-dir\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871200 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-model-cache\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871234 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tls-certs\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871316 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871275 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5rhq\" (UniqueName: \"kubernetes.io/projected/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kube-api-access-z5rhq\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871593 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871526 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kserve-provision-location\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871593 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871556 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-model-cache\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871726 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871595 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tmp-dir\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.871783 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.871741 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-home\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.873716 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.873692 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-dshm\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.874050 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.874028 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tls-certs\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:50.879283 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:50.879263 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5rhq\" (UniqueName: \"kubernetes.io/projected/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kube-api-access-z5rhq\") pod \"router-with-refs-test-kserve-55c8cb8477-vcz7b\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:51.002539 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:51.002438 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:44:51.142311 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:51.142225 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"]
Apr 24 21:44:51.145516 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:44:51.145480 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda81c3e1d_2429_457d_83d8_cabdf3d8e2ca.slice/crio-05e1068a14d362b0088278b216ea3094ae25eed9c9e887c1c66a2dafb0f347f3 WatchSource:0}: Error finding container 05e1068a14d362b0088278b216ea3094ae25eed9c9e887c1c66a2dafb0f347f3: Status 404 returned error can't find the container with id 05e1068a14d362b0088278b216ea3094ae25eed9c9e887c1c66a2dafb0f347f3
Apr 24 21:44:51.218526 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:51.218492 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" event={"ID":"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca","Type":"ContainerStarted","Data":"4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31"}
Apr 24 21:44:51.218632 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:51.218534 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" event={"ID":"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca","Type":"ContainerStarted","Data":"05e1068a14d362b0088278b216ea3094ae25eed9c9e887c1c66a2dafb0f347f3"}
Apr 24 21:44:55.234146 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:55.234108 2580 generic.go:358] "Generic (PLEG): container finished" podID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerID="4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31" exitCode=0
Apr 24 21:44:55.234506 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:55.234185 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" event={"ID":"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca","Type":"ContainerDied","Data":"4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31"}
Apr 24 21:44:56.239945 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:56.239908 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" event={"ID":"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca","Type":"ContainerStarted","Data":"15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4"}
Apr 24 21:44:56.262934 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:44:56.262866 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podStartSLOduration=6.26284525 podStartE2EDuration="6.26284525s" podCreationTimestamp="2026-04-24 21:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:44:56.261526911 +0000 UTC m=+1110.245741727" watchObservedRunningTime="2026-04-24 21:44:56.26284525 +0000 UTC m=+1110.247060030"
Apr 24 21:45:01.003366 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:01.003316 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:45:01.003979 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:01.003418 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:45:01.004858 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:01.004823 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:45:11.003798 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:11.003754 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:45:12.295703 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.295651 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-74fbd5dd49-f5v65_b0fb8999-756b-4d2e-9dec-6a7c12345abc/main/0.log"
Apr 24 21:45:12.296117 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.296044 2580 generic.go:358] "Generic (PLEG): container finished" podID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerID="540a55bb9769326978b3d0f590e75918c0d457e1e9b597a5daa4458739eb65b0" exitCode=137
Apr 24 21:45:12.296169 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.296123 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" event={"ID":"b0fb8999-756b-4d2e-9dec-6a7c12345abc","Type":"ContainerDied","Data":"540a55bb9769326978b3d0f590e75918c0d457e1e9b597a5daa4458739eb65b0"}
Apr 24 21:45:12.354564 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.354529 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-74fbd5dd49-f5v65_b0fb8999-756b-4d2e-9dec-6a7c12345abc/main/0.log"
Apr 24 21:45:12.355027 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.355007 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"
Apr 24 21:45:12.469273 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469234 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-home\") pod \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") "
Apr 24 21:45:12.469475 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469305 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tmp-dir\") pod \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") "
Apr 24 21:45:12.469475 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469332 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-dshm\") pod \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") "
Apr 24 21:45:12.469475 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469374 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-model-cache\") pod \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") "
Apr 24 21:45:12.469475 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469456 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tls-certs\") pod \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") "
Apr 24 21:45:12.469719 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469505 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhk7c\" (UniqueName: \"kubernetes.io/projected/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kube-api-access-bhk7c\") pod \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") "
Apr 24 21:45:12.469719 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469552 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kserve-provision-location\") pod \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\" (UID: \"b0fb8999-756b-4d2e-9dec-6a7c12345abc\") "
Apr 24 21:45:12.469719 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469687 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-model-cache" (OuterVolumeSpecName: "model-cache") pod "b0fb8999-756b-4d2e-9dec-6a7c12345abc" (UID: "b0fb8999-756b-4d2e-9dec-6a7c12345abc"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:45:12.469884 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469862 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-home" (OuterVolumeSpecName: "home") pod "b0fb8999-756b-4d2e-9dec-6a7c12345abc" (UID: "b0fb8999-756b-4d2e-9dec-6a7c12345abc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:45:12.469931 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.469868 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:45:12.472161 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.472096 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-dshm" (OuterVolumeSpecName: "dshm") pod "b0fb8999-756b-4d2e-9dec-6a7c12345abc" (UID: "b0fb8999-756b-4d2e-9dec-6a7c12345abc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:45:12.472511 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.472481 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kube-api-access-bhk7c" (OuterVolumeSpecName: "kube-api-access-bhk7c") pod "b0fb8999-756b-4d2e-9dec-6a7c12345abc" (UID: "b0fb8999-756b-4d2e-9dec-6a7c12345abc"). InnerVolumeSpecName "kube-api-access-bhk7c". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:45:12.472744 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.472721 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b0fb8999-756b-4d2e-9dec-6a7c12345abc" (UID: "b0fb8999-756b-4d2e-9dec-6a7c12345abc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:45:12.487726 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.487681 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "b0fb8999-756b-4d2e-9dec-6a7c12345abc" (UID: "b0fb8999-756b-4d2e-9dec-6a7c12345abc"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:45:12.525813 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.525764 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0fb8999-756b-4d2e-9dec-6a7c12345abc" (UID: "b0fb8999-756b-4d2e-9dec-6a7c12345abc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:45:12.570898 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.570858 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:45:12.570898 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.570891 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bhk7c\" (UniqueName: \"kubernetes.io/projected/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kube-api-access-bhk7c\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:45:12.570898 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.570903 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:45:12.571130 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.570912 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:45:12.571130 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.570921 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:45:12.571130 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:12.570930 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b0fb8999-756b-4d2e-9dec-6a7c12345abc-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:45:13.300404 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:13.300374 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-74fbd5dd49-f5v65_b0fb8999-756b-4d2e-9dec-6a7c12345abc/main/0.log"
Apr 24 21:45:13.300875 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:13.300847 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65" event={"ID":"b0fb8999-756b-4d2e-9dec-6a7c12345abc","Type":"ContainerDied","Data":"4d6619942dace8e0e79cdc7809a28e586537860fc3f61e909e50f016ef94baa9"}
Apr 24 21:45:13.300941 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:13.300906 2580 scope.go:117] "RemoveContainer" containerID="540a55bb9769326978b3d0f590e75918c0d457e1e9b597a5daa4458739eb65b0"
Apr 24 21:45:13.300993 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:13.300855 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"
Apr 24 21:45:13.317173 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:13.317135 2580 scope.go:117] "RemoveContainer" containerID="ba601907f3e2a5b0bfd3175297b935fc43caef7d40fcd789a862565408e454da"
Apr 24 21:45:13.323726 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:13.323690 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"]
Apr 24 21:45:13.327866 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:13.327838 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74fbd5dd49-f5v65"]
Apr 24 21:45:14.579237 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:14.579189 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" path="/var/lib/kubelet/pods/b0fb8999-756b-4d2e-9dec-6a7c12345abc/volumes"
Apr 24 21:45:21.003850 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:21.003795 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:45:31.003134 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:31.003081 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:45:41.003894 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:41.003784 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:45:51.003712 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:45:51.003627 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:46:01.003478 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:01.003434 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:46:11.003091 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:11.003049 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:46:21.003908 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:21.003857 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:46:26.532930 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:26.532901 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log"
Apr 24 21:46:26.534152 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:26.534130 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log"
Apr 24 21:46:31.003796 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:31.003752 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 24 21:46:41.012767 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:41.012730 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:46:41.020534 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:41.020510 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"
Apr 24 21:46:46.847233 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:46.847184 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"]
Apr 24 21:46:46.847683 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:46:46.847493 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" containerID="cri-o://15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4" gracePeriod=30
Apr 24 21:47:03.897039 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.896996 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"]
Apr 24 21:47:03.897612 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.897495 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main"
Apr 24 21:47:03.897612 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.897517 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main"
Apr 24 21:47:03.897612 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.897542 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="storage-initializer"
Apr 24 21:47:03.897612 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.897552 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="storage-initializer"
Apr 24 21:47:03.897860 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.897635 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0fb8999-756b-4d2e-9dec-6a7c12345abc" containerName="main"
Apr 24 21:47:03.901200 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.901176 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"
Apr 24 21:47:03.903738 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.903716 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-czvl8\""
Apr 24 21:47:03.903870 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.903834 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 24 21:47:03.909414 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.909376 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr"]
Apr 24 21:47:03.913001 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.912977 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"]
Apr 24 21:47:03.913126 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.913113 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr"
Apr 24 21:47:03.928888 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:03.928853 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr"]
Apr 24 21:47:04.036016 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.035967 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"
Apr 24 21:47:04.036016 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036016 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjfmw\" (UniqueName: \"kubernetes.io/projected/e446da05-1992-4e35-873d-4b41352c46a6-kube-api-access-gjfmw\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"
Apr 24 21:47:04.036230 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036061 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"
Apr 24 21:47:04.036230 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036087 2580 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qrqs\" (UniqueName: \"kubernetes.io/projected/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kube-api-access-6qrqs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.036230 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036114 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.036230 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036130 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.036230 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036144 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.036385 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036235 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.036385 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036267 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.036385 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e446da05-1992-4e35-873d-4b41352c46a6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.036385 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036335 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.036505 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036378 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.036505 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036415 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.036505 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.036436 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137295 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137255 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.137295 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137294 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6qrqs\" (UniqueName: \"kubernetes.io/projected/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kube-api-access-6qrqs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137524 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137323 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137524 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.137524 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137378 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.137524 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137416 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137524 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137439 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137825 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137695 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e446da05-1992-4e35-873d-4b41352c46a6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.137825 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137731 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137825 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.137974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137825 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137854 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137881 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.137974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-tmp-dir\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.137974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjfmw\" (UniqueName: \"kubernetes.io/projected/e446da05-1992-4e35-873d-4b41352c46a6-kube-api-access-gjfmw\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.137974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137930 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137952 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.137974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.137734 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-kserve-provision-location\") pod 
\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.138361 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.138198 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.138417 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.138372 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.138470 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.138414 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.138567 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.138493 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: 
\"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.140299 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.140271 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.140455 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.140438 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.140637 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.140614 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e446da05-1992-4e35-873d-4b41352c46a6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.140749 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.140736 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 
21:47:04.145506 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.145482 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qrqs\" (UniqueName: \"kubernetes.io/projected/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kube-api-access-6qrqs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.145608 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.145570 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjfmw\" (UniqueName: \"kubernetes.io/projected/e446da05-1992-4e35-873d-4b41352c46a6-kube-api-access-gjfmw\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.212442 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.212355 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:04.222332 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.222301 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:04.360601 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.360463 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"] Apr 24 21:47:04.367322 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:47:04.367289 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode446da05_1992_4e35_873d_4b41352c46a6.slice/crio-c9386ba1b9396c42c2d4a7af61f9023373feddc065867ae870e565ea2451c798 WatchSource:0}: Error finding container c9386ba1b9396c42c2d4a7af61f9023373feddc065867ae870e565ea2451c798: Status 404 returned error can't find the container with id c9386ba1b9396c42c2d4a7af61f9023373feddc065867ae870e565ea2451c798 Apr 24 21:47:04.396189 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.396166 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr"] Apr 24 21:47:04.398220 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:47:04.398184 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44da7e50_aeb4_4fbf_80a1_31f697b4b936.slice/crio-47bf6bb0b796d9b39f8623a2b44ebb6d769d9fdac325a20587cee609098ccb82 WatchSource:0}: Error finding container 47bf6bb0b796d9b39f8623a2b44ebb6d769d9fdac325a20587cee609098ccb82: Status 404 returned error can't find the container with id 47bf6bb0b796d9b39f8623a2b44ebb6d769d9fdac325a20587cee609098ccb82 Apr 24 21:47:04.678527 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.678444 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" 
event={"ID":"e446da05-1992-4e35-873d-4b41352c46a6","Type":"ContainerStarted","Data":"c9386ba1b9396c42c2d4a7af61f9023373feddc065867ae870e565ea2451c798"} Apr 24 21:47:04.680930 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.680596 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" event={"ID":"44da7e50-aeb4-4fbf-80a1-31f697b4b936","Type":"ContainerStarted","Data":"2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc"} Apr 24 21:47:04.680930 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:04.680637 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" event={"ID":"44da7e50-aeb4-4fbf-80a1-31f697b4b936","Type":"ContainerStarted","Data":"47bf6bb0b796d9b39f8623a2b44ebb6d769d9fdac325a20587cee609098ccb82"} Apr 24 21:47:05.686897 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:05.686843 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" event={"ID":"e446da05-1992-4e35-873d-4b41352c46a6","Type":"ContainerStarted","Data":"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09"} Apr 24 21:47:05.687354 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:05.686966 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:06.697820 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:06.697769 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" event={"ID":"e446da05-1992-4e35-873d-4b41352c46a6","Type":"ContainerStarted","Data":"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e"} Apr 24 21:47:09.717710 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:09.717613 2580 
generic.go:358] "Generic (PLEG): container finished" podID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerID="2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc" exitCode=0 Apr 24 21:47:09.718164 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:09.717699 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" event={"ID":"44da7e50-aeb4-4fbf-80a1-31f697b4b936","Type":"ContainerDied","Data":"2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc"} Apr 24 21:47:10.724163 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:10.724123 2580 generic.go:358] "Generic (PLEG): container finished" podID="e446da05-1992-4e35-873d-4b41352c46a6" containerID="57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e" exitCode=0 Apr 24 21:47:10.724637 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:10.724192 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" event={"ID":"e446da05-1992-4e35-873d-4b41352c46a6","Type":"ContainerDied","Data":"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e"} Apr 24 21:47:10.726219 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:10.726193 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" event={"ID":"44da7e50-aeb4-4fbf-80a1-31f697b4b936","Type":"ContainerStarted","Data":"e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d"} Apr 24 21:47:10.775740 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:10.775451 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podStartSLOduration=7.77543042 podStartE2EDuration="7.77543042s" podCreationTimestamp="2026-04-24 21:47:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:10.772793928 +0000 UTC m=+1244.757008706" watchObservedRunningTime="2026-04-24 21:47:10.77543042 +0000 UTC m=+1244.759645200" Apr 24 21:47:11.732183 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:11.732138 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" event={"ID":"e446da05-1992-4e35-873d-4b41352c46a6","Type":"ContainerStarted","Data":"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640"} Apr 24 21:47:11.758957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:11.758896 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podStartSLOduration=7.807637643 podStartE2EDuration="8.758879228s" podCreationTimestamp="2026-04-24 21:47:03 +0000 UTC" firstStartedPulling="2026-04-24 21:47:04.369064963 +0000 UTC m=+1238.353279720" lastFinishedPulling="2026-04-24 21:47:05.320306546 +0000 UTC m=+1239.304521305" observedRunningTime="2026-04-24 21:47:11.756647411 +0000 UTC m=+1245.740862192" watchObservedRunningTime="2026-04-24 21:47:11.758879228 +0000 UTC m=+1245.743094007" Apr 24 21:47:14.212847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:14.212807 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:14.212847 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:14.212853 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:14.214644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:14.214610 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:47:14.222525 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:14.222498 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:14.222707 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:14.222541 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:47:14.224068 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:14.224039 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:47:17.125408 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.125373 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-55c8cb8477-vcz7b_a81c3e1d-2429-457d-83d8-cabdf3d8e2ca/main/0.log" Apr 24 21:47:17.125866 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.125794 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" Apr 24 21:47:17.181021 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.180978 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rhq\" (UniqueName: \"kubernetes.io/projected/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kube-api-access-z5rhq\") pod \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " Apr 24 21:47:17.181226 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.181076 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-model-cache\") pod \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " Apr 24 21:47:17.181226 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.181119 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-dshm\") pod \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " Apr 24 21:47:17.181226 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.181148 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kserve-provision-location\") pod \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " Apr 24 21:47:17.181226 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.181190 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-home\") pod \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " Apr 24 21:47:17.181454 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:47:17.181328 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-model-cache" (OuterVolumeSpecName: "model-cache") pod "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" (UID: "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:17.181454 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.181390 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tmp-dir\") pod \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " Apr 24 21:47:17.181561 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.181459 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tls-certs\") pod \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\" (UID: \"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca\") " Apr 24 21:47:17.182278 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.181968 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-home" (OuterVolumeSpecName: "home") pod "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" (UID: "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:17.182278 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.181992 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:47:17.184442 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.184413 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" (UID: "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:17.197516 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.197471 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" (UID: "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:17.197953 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.197927 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kube-api-access-z5rhq" (OuterVolumeSpecName: "kube-api-access-z5rhq") pod "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" (UID: "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca"). InnerVolumeSpecName "kube-api-access-z5rhq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:47:17.199492 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.199464 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-dshm" (OuterVolumeSpecName: "dshm") pod "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" (UID: "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:17.252691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.252623 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" (UID: "a81c3e1d-2429-457d-83d8-cabdf3d8e2ca"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:47:17.283499 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.283462 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5rhq\" (UniqueName: \"kubernetes.io/projected/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kube-api-access-z5rhq\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:47:17.283499 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.283496 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:47:17.283706 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.283508 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:47:17.283706 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:47:17.283518 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:47:17.283706 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.283527 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:47:17.283706 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.283534 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:47:17.758381 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.758356 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-55c8cb8477-vcz7b_a81c3e1d-2429-457d-83d8-cabdf3d8e2ca/main/0.log" Apr 24 21:47:17.758824 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.758790 2580 generic.go:358] "Generic (PLEG): container finished" podID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerID="15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4" exitCode=137 Apr 24 21:47:17.758957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.758861 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" event={"ID":"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca","Type":"ContainerDied","Data":"15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4"} Apr 24 21:47:17.758957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.758876 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" Apr 24 21:47:17.758957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.758906 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b" event={"ID":"a81c3e1d-2429-457d-83d8-cabdf3d8e2ca","Type":"ContainerDied","Data":"05e1068a14d362b0088278b216ea3094ae25eed9c9e887c1c66a2dafb0f347f3"} Apr 24 21:47:17.758957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.758933 2580 scope.go:117] "RemoveContainer" containerID="15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4" Apr 24 21:47:17.775162 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.775134 2580 scope.go:117] "RemoveContainer" containerID="4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31" Apr 24 21:47:17.787538 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.787502 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"] Apr 24 21:47:17.790585 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.790553 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-55c8cb8477-vcz7b"] Apr 24 21:47:17.794626 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.794588 2580 scope.go:117] "RemoveContainer" containerID="15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4" Apr 24 21:47:17.795096 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:47:17.795067 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4\": container with ID starting with 15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4 not found: ID does not exist" containerID="15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4" Apr 24 21:47:17.795203 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:47:17.795112 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4"} err="failed to get container status \"15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4\": rpc error: code = NotFound desc = could not find container \"15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4\": container with ID starting with 15fdf84b9e9dba31ffe53bacb7db723f6775b235cb5c72b30b2ae7ed78d9a6b4 not found: ID does not exist" Apr 24 21:47:17.795203 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.795141 2580 scope.go:117] "RemoveContainer" containerID="4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31" Apr 24 21:47:17.795471 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:47:17.795452 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31\": container with ID starting with 4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31 not found: ID does not exist" containerID="4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31" Apr 24 21:47:17.795536 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:17.795489 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31"} err="failed to get container status \"4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31\": rpc error: code = NotFound desc = could not find container \"4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31\": container with ID starting with 4e0801aa59333cb359337cd723e02b9ac88db3fcf592c19ed4e31adf340b6c31 not found: ID does not exist" Apr 24 21:47:18.579386 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:18.579348 2580 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" path="/var/lib/kubelet/pods/a81c3e1d-2429-457d-83d8-cabdf3d8e2ca/volumes" Apr 24 21:47:24.213062 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:24.213005 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:47:24.223311 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:24.223267 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:47:24.225231 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:24.225205 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:47:34.213960 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:34.213777 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:47:34.223515 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:34.223462 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:47:44.213056 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:44.213004 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:47:44.223135 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:44.223081 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:47:54.213645 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:54.213589 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:47:54.223191 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:47:54.223146 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:04.214012 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:04.213563 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" 
podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:48:04.223326 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:04.223282 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:14.212982 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:14.212938 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:48:14.224350 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:14.224313 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:24.213833 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:24.213781 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:48:24.223399 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:24.223364 2580 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:34.213288 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:34.213222 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:48:34.223150 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:34.223105 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:44.213478 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:44.213350 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:48:44.224161 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:44.223797 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 
21:48:54.213203 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:54.213138 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:48:54.223335 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:48:54.223298 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:04.212837 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:04.212781 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:49:04.223066 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:04.223028 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:14.213347 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:14.213298 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.33:8001/health\": dial tcp 10.132.0.33:8001: connect: connection refused" Apr 24 21:49:14.222962 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:14.222920 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:24.223290 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:24.223222 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:24.229028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:24.228996 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:49:24.241395 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:24.241353 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:49:34.234619 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:34.234570 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:49:34.243675 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:34.243625 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:49:42.652760 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:42.652726 2580 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr"] Apr 24 21:49:42.653204 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:42.653014 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" containerID="cri-o://e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d" gracePeriod=30 Apr 24 21:49:42.658621 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:42.658585 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"] Apr 24 21:49:42.658989 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:42.658945 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" containerID="cri-o://4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640" gracePeriod=30 Apr 24 21:49:50.311781 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.311738 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4"] Apr 24 21:49:50.312217 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.312068 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" Apr 24 21:49:50.312217 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.312079 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" Apr 24 21:49:50.312217 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.312096 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="storage-initializer" Apr 24 21:49:50.312217 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.312102 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="storage-initializer" Apr 24 21:49:50.312217 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.312158 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a81c3e1d-2429-457d-83d8-cabdf3d8e2ca" containerName="main" Apr 24 21:49:50.315693 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.315649 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.319071 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.319047 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-p4dfp\"" Apr 24 21:49:50.319429 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.319412 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 24 21:49:50.335467 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.335431 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4"] Apr 24 21:49:50.337104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.337062 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf"] Apr 24 21:49:50.340983 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.340954 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.355053 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.355025 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf"] Apr 24 21:49:50.377082 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.377043 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5tgf\" (UniqueName: \"kubernetes.io/projected/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kube-api-access-c5tgf\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.377261 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.377104 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-home\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.377261 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.377125 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.377261 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.377151 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-dshm\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.377261 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.377188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.377261 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.377243 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-model-cache\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.377453 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.377281 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478199 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478157 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-home\") pod 
\"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478199 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478198 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478463 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-dshm\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478463 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478442 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.478570 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478476 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqlbc\" (UniqueName: \"kubernetes.io/projected/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kube-api-access-pqlbc\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: 
\"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.478570 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478507 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478570 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478539 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.478766 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478592 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-home\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478766 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478604 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 
21:49:50.478766 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478712 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-model-cache\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478766 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478751 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.478966 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478780 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478966 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478787 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478966 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478838 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.478966 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478891 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5tgf\" (UniqueName: \"kubernetes.io/projected/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kube-api-access-c5tgf\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.478966 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.478943 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.479155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.479050 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-model-cache\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.479155 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.479074 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.480815 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.480783 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-dshm\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.481006 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.480989 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.487612 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.487587 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5tgf\" (UniqueName: \"kubernetes.io/projected/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kube-api-access-c5tgf\") pod \"custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.579437 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: 
\"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579437 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579422 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579688 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579444 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqlbc\" (UniqueName: \"kubernetes.io/projected/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kube-api-access-pqlbc\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579688 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579463 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579688 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579491 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579688 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579526 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579688 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579581 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579963 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579812 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579963 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579861 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.579963 
ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.579899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.580163 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.580143 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.582580 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.582544 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.582754 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.582600 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.593956 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.593923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqlbc\" 
(UniqueName: \"kubernetes.io/projected/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kube-api-access-pqlbc\") pod \"custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.629181 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.629146 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:50.652674 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.652624 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:49:50.789081 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.789019 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4"] Apr 24 21:49:50.791848 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:49:50.791810 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7f0592c_64c4_4c25_a8b1_81653e9bf9a0.slice/crio-68188c01f5d86a5d5693c1d8bdbb005379a4b0e68c3e8cb2f9d2e433136157a6 WatchSource:0}: Error finding container 68188c01f5d86a5d5693c1d8bdbb005379a4b0e68c3e8cb2f9d2e433136157a6: Status 404 returned error can't find the container with id 68188c01f5d86a5d5693c1d8bdbb005379a4b0e68c3e8cb2f9d2e433136157a6 Apr 24 21:49:50.793816 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.793794 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:49:50.810732 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:50.810706 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf"] Apr 24 21:49:50.813479 
ip-10-0-131-237 kubenswrapper[2580]: W0424 21:49:50.813449 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod909fa8c1_24be_412e_bfb4_b9daa8b30bad.slice/crio-f99eca8c9c918035ea93be64d54efc444320302320f3ff3d35f2bee66ecf34be WatchSource:0}: Error finding container f99eca8c9c918035ea93be64d54efc444320302320f3ff3d35f2bee66ecf34be: Status 404 returned error can't find the container with id f99eca8c9c918035ea93be64d54efc444320302320f3ff3d35f2bee66ecf34be Apr 24 21:49:51.330574 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:51.330536 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" event={"ID":"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0","Type":"ContainerStarted","Data":"66defc4ba9cc14af6217e902ce6aa26cc12df163f86968f7f3f358a0fe7f7165"} Apr 24 21:49:51.330574 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:51.330581 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" event={"ID":"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0","Type":"ContainerStarted","Data":"68188c01f5d86a5d5693c1d8bdbb005379a4b0e68c3e8cb2f9d2e433136157a6"} Apr 24 21:49:51.331141 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:51.330624 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:49:51.331937 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:51.331915 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" event={"ID":"909fa8c1-24be-412e-bfb4-b9daa8b30bad","Type":"ContainerStarted","Data":"ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70"} Apr 24 21:49:51.332040 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:51.331944 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" event={"ID":"909fa8c1-24be-412e-bfb4-b9daa8b30bad","Type":"ContainerStarted","Data":"f99eca8c9c918035ea93be64d54efc444320302320f3ff3d35f2bee66ecf34be"} Apr 24 21:49:52.348005 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:49:52.347957 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" event={"ID":"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0","Type":"ContainerStarted","Data":"0f3c575d34b109930a0b2b36f72e058f25ef1381ec2b9b3266ec15c4dd082a7a"} Apr 24 21:50:03.364572 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:03.364530 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 21:50:12.659207 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:12.659148 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="llm-d-routing-sidecar" containerID="cri-o://5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09" gracePeriod=2 Apr 24 21:50:13.058339 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.058315 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7_e446da05-1992-4e35-873d-4b41352c46a6/main/0.log" Apr 24 21:50:13.059104 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.059084 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:50:13.062084 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.062063 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:50:13.184836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.184799 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-home\") pod \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " Apr 24 21:50:13.184836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.184848 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-model-cache\") pod \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " Apr 24 21:50:13.185096 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.184865 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kserve-provision-location\") pod \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " Apr 24 21:50:13.185096 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.184890 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-dshm\") pod \"e446da05-1992-4e35-873d-4b41352c46a6\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " Apr 24 21:50:13.185096 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.184912 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-model-cache\") pod \"e446da05-1992-4e35-873d-4b41352c46a6\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " Apr 24 21:50:13.185096 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:50:13.184935 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-kserve-provision-location\") pod \"e446da05-1992-4e35-873d-4b41352c46a6\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " Apr 24 21:50:13.185096 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.184993 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tmp-dir\") pod \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " Apr 24 21:50:13.185096 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185023 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e446da05-1992-4e35-873d-4b41352c46a6-tls-certs\") pod \"e446da05-1992-4e35-873d-4b41352c46a6\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " Apr 24 21:50:13.185096 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185061 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-home\") pod \"e446da05-1992-4e35-873d-4b41352c46a6\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " Apr 24 21:50:13.185443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185111 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-dshm\") pod \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " Apr 24 21:50:13.185443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185173 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qrqs\" (UniqueName: 
\"kubernetes.io/projected/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kube-api-access-6qrqs\") pod \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " Apr 24 21:50:13.185443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185166 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-model-cache" (OuterVolumeSpecName: "model-cache") pod "44da7e50-aeb4-4fbf-80a1-31f697b4b936" (UID: "44da7e50-aeb4-4fbf-80a1-31f697b4b936"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.185443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185230 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjfmw\" (UniqueName: \"kubernetes.io/projected/e446da05-1992-4e35-873d-4b41352c46a6-kube-api-access-gjfmw\") pod \"e446da05-1992-4e35-873d-4b41352c46a6\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " Apr 24 21:50:13.185443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185261 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-tmp-dir\") pod \"e446da05-1992-4e35-873d-4b41352c46a6\" (UID: \"e446da05-1992-4e35-873d-4b41352c46a6\") " Apr 24 21:50:13.185443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185291 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tls-certs\") pod \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\" (UID: \"44da7e50-aeb4-4fbf-80a1-31f697b4b936\") " Apr 24 21:50:13.185443 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185381 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-model-cache" (OuterVolumeSpecName: 
"model-cache") pod "e446da05-1992-4e35-873d-4b41352c46a6" (UID: "e446da05-1992-4e35-873d-4b41352c46a6"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.185818 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185575 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.185818 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185592 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.185921 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185891 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-home" (OuterVolumeSpecName: "home") pod "44da7e50-aeb4-4fbf-80a1-31f697b4b936" (UID: "44da7e50-aeb4-4fbf-80a1-31f697b4b936"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.185921 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.185892 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-home" (OuterVolumeSpecName: "home") pod "e446da05-1992-4e35-873d-4b41352c46a6" (UID: "e446da05-1992-4e35-873d-4b41352c46a6"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.188125 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.188092 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-dshm" (OuterVolumeSpecName: "dshm") pod "44da7e50-aeb4-4fbf-80a1-31f697b4b936" (UID: "44da7e50-aeb4-4fbf-80a1-31f697b4b936"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.188464 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.188428 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-dshm" (OuterVolumeSpecName: "dshm") pod "e446da05-1992-4e35-873d-4b41352c46a6" (UID: "e446da05-1992-4e35-873d-4b41352c46a6"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.188905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.188698 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kube-api-access-6qrqs" (OuterVolumeSpecName: "kube-api-access-6qrqs") pod "44da7e50-aeb4-4fbf-80a1-31f697b4b936" (UID: "44da7e50-aeb4-4fbf-80a1-31f697b4b936"). InnerVolumeSpecName "kube-api-access-6qrqs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:13.188905 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.188868 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e446da05-1992-4e35-873d-4b41352c46a6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e446da05-1992-4e35-873d-4b41352c46a6" (UID: "e446da05-1992-4e35-873d-4b41352c46a6"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:13.190078 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.190051 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "44da7e50-aeb4-4fbf-80a1-31f697b4b936" (UID: "44da7e50-aeb4-4fbf-80a1-31f697b4b936"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:13.190383 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.190358 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e446da05-1992-4e35-873d-4b41352c46a6-kube-api-access-gjfmw" (OuterVolumeSpecName: "kube-api-access-gjfmw") pod "e446da05-1992-4e35-873d-4b41352c46a6" (UID: "e446da05-1992-4e35-873d-4b41352c46a6"). InnerVolumeSpecName "kube-api-access-gjfmw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:13.201974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.201911 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "44da7e50-aeb4-4fbf-80a1-31f697b4b936" (UID: "44da7e50-aeb4-4fbf-80a1-31f697b4b936"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.207023 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.206993 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "e446da05-1992-4e35-873d-4b41352c46a6" (UID: "e446da05-1992-4e35-873d-4b41352c46a6"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.249182 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.249139 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e446da05-1992-4e35-873d-4b41352c46a6" (UID: "e446da05-1992-4e35-873d-4b41352c46a6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.253791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.253763 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "44da7e50-aeb4-4fbf-80a1-31f697b4b936" (UID: "44da7e50-aeb4-4fbf-80a1-31f697b4b936"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:13.286197 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286161 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6qrqs\" (UniqueName: \"kubernetes.io/projected/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kube-api-access-6qrqs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286197 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286192 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjfmw\" (UniqueName: \"kubernetes.io/projected/e446da05-1992-4e35-873d-4b41352c46a6-kube-api-access-gjfmw\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286197 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286204 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 
24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286214 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286222 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286231 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286240 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286249 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286257 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286265 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/e446da05-1992-4e35-873d-4b41352c46a6-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286273 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e446da05-1992-4e35-873d-4b41352c46a6-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.286422 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.286280 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/44da7e50-aeb4-4fbf-80a1-31f697b4b936-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:50:13.425710 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.425681 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7_e446da05-1992-4e35-873d-4b41352c46a6/main/0.log" Apr 24 21:50:13.426276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.426249 2580 generic.go:358] "Generic (PLEG): container finished" podID="e446da05-1992-4e35-873d-4b41352c46a6" containerID="4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640" exitCode=137 Apr 24 21:50:13.426276 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.426276 2580 generic.go:358] "Generic (PLEG): container finished" podID="e446da05-1992-4e35-873d-4b41352c46a6" containerID="5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09" exitCode=0 Apr 24 21:50:13.426418 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.426337 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" event={"ID":"e446da05-1992-4e35-873d-4b41352c46a6","Type":"ContainerDied","Data":"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640"} Apr 24 21:50:13.426418 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:50:13.426341 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" Apr 24 21:50:13.426418 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.426373 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" event={"ID":"e446da05-1992-4e35-873d-4b41352c46a6","Type":"ContainerDied","Data":"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09"} Apr 24 21:50:13.426418 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.426388 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7" event={"ID":"e446da05-1992-4e35-873d-4b41352c46a6","Type":"ContainerDied","Data":"c9386ba1b9396c42c2d4a7af61f9023373feddc065867ae870e565ea2451c798"} Apr 24 21:50:13.426418 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.426407 2580 scope.go:117] "RemoveContainer" containerID="4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640" Apr 24 21:50:13.427782 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.427759 2580 generic.go:358] "Generic (PLEG): container finished" podID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerID="e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d" exitCode=137 Apr 24 21:50:13.427870 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.427791 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" event={"ID":"44da7e50-aeb4-4fbf-80a1-31f697b4b936","Type":"ContainerDied","Data":"e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d"} Apr 24 21:50:13.427870 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.427822 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" Apr 24 21:50:13.427870 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.427830 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr" event={"ID":"44da7e50-aeb4-4fbf-80a1-31f697b4b936","Type":"ContainerDied","Data":"47bf6bb0b796d9b39f8623a2b44ebb6d769d9fdac325a20587cee609098ccb82"} Apr 24 21:50:13.435713 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.435491 2580 scope.go:117] "RemoveContainer" containerID="57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e" Apr 24 21:50:13.451509 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.451478 2580 scope.go:117] "RemoveContainer" containerID="5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09" Apr 24 21:50:13.454052 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.454023 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"] Apr 24 21:50:13.457650 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.457614 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-6b794d9b94ck5j7"] Apr 24 21:50:13.460834 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.460812 2580 scope.go:117] "RemoveContainer" containerID="4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640" Apr 24 21:50:13.461160 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:50:13.461133 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640\": container with ID starting with 4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640 not found: ID does not exist" 
containerID="4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640" Apr 24 21:50:13.461218 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.461175 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640"} err="failed to get container status \"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640\": rpc error: code = NotFound desc = could not find container \"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640\": container with ID starting with 4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640 not found: ID does not exist" Apr 24 21:50:13.461218 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.461202 2580 scope.go:117] "RemoveContainer" containerID="57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e" Apr 24 21:50:13.461507 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:50:13.461488 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e\": container with ID starting with 57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e not found: ID does not exist" containerID="57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e" Apr 24 21:50:13.461557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.461513 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e"} err="failed to get container status \"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e\": rpc error: code = NotFound desc = could not find container \"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e\": container with ID starting with 57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e not found: ID does not exist" Apr 24 
21:50:13.461557 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.461532 2580 scope.go:117] "RemoveContainer" containerID="5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09" Apr 24 21:50:13.461851 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:50:13.461829 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09\": container with ID starting with 5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09 not found: ID does not exist" containerID="5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09" Apr 24 21:50:13.461971 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.461851 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09"} err="failed to get container status \"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09\": rpc error: code = NotFound desc = could not find container \"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09\": container with ID starting with 5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09 not found: ID does not exist" Apr 24 21:50:13.461971 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.461866 2580 scope.go:117] "RemoveContainer" containerID="4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640" Apr 24 21:50:13.462094 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.462073 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640"} err="failed to get container status \"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640\": rpc error: code = NotFound desc = could not find container \"4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640\": container with ID starting 
with 4f2ec54b74ee62e0c7c1290eb3b46bb977fd2511a4825b9028195dc80eb7c640 not found: ID does not exist" Apr 24 21:50:13.462154 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.462095 2580 scope.go:117] "RemoveContainer" containerID="57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e" Apr 24 21:50:13.462303 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.462283 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e"} err="failed to get container status \"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e\": rpc error: code = NotFound desc = could not find container \"57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e\": container with ID starting with 57dd3adb8f225a0a599c98644e97878fc03667fff713b6c56049bf0226f1370e not found: ID does not exist" Apr 24 21:50:13.462367 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.462305 2580 scope.go:117] "RemoveContainer" containerID="5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09" Apr 24 21:50:13.462511 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.462494 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09"} err="failed to get container status \"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09\": rpc error: code = NotFound desc = could not find container \"5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09\": container with ID starting with 5a19caf5ac0dce81c4b415c64eb77445c9100c0adbecde921149e5d393f03c09 not found: ID does not exist" Apr 24 21:50:13.462568 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.462511 2580 scope.go:117] "RemoveContainer" containerID="e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d" Apr 24 21:50:13.470832 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.470796 
2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr"] Apr 24 21:50:13.471801 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.471776 2580 scope.go:117] "RemoveContainer" containerID="2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc" Apr 24 21:50:13.473459 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.473413 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-5cnhvzr"] Apr 24 21:50:13.537455 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.537432 2580 scope.go:117] "RemoveContainer" containerID="e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d" Apr 24 21:50:13.537856 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:50:13.537829 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d\": container with ID starting with e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d not found: ID does not exist" containerID="e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d" Apr 24 21:50:13.537961 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.537869 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d"} err="failed to get container status \"e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d\": rpc error: code = NotFound desc = could not find container \"e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d\": container with ID starting with e60f1e2464e9b352b65706ec905d57962569df67bb0e83a2242e97653e79d81d not found: ID does not exist" Apr 24 21:50:13.537961 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.537899 2580 scope.go:117] "RemoveContainer" 
containerID="2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc" Apr 24 21:50:13.538174 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:50:13.538158 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc\": container with ID starting with 2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc not found: ID does not exist" containerID="2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc" Apr 24 21:50:13.538229 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:13.538182 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc"} err="failed to get container status \"2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc\": rpc error: code = NotFound desc = could not find container \"2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc\": container with ID starting with 2f1da57a0185c5f615f5badebe40a7bf36f49b4e403b803e978684084d589fdc not found: ID does not exist" Apr 24 21:50:14.578809 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:14.578774 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" path="/var/lib/kubelet/pods/44da7e50-aeb4-4fbf-80a1-31f697b4b936/volumes" Apr 24 21:50:14.579252 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:50:14.579208 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e446da05-1992-4e35-873d-4b41352c46a6" path="/var/lib/kubelet/pods/e446da05-1992-4e35-873d-4b41352c46a6/volumes" Apr 24 21:51:26.554134 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:51:26.554102 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 
21:51:26.558564 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:51:26.558533 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:51:56.774374 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:51:56.774289 2580 generic.go:358] "Generic (PLEG): container finished" podID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerID="ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70" exitCode=0 Apr 24 21:51:56.774374 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:51:56.774361 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" event={"ID":"909fa8c1-24be-412e-bfb4-b9daa8b30bad","Type":"ContainerDied","Data":"ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70"} Apr 24 21:51:57.779340 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:51:57.779293 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" event={"ID":"909fa8c1-24be-412e-bfb4-b9daa8b30bad","Type":"ContainerStarted","Data":"395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150"} Apr 24 21:51:57.803821 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:51:57.803755 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podStartSLOduration=127.803733739 podStartE2EDuration="2m7.803733739s" podCreationTimestamp="2026-04-24 21:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:51:57.799983024 +0000 UTC m=+1531.784197816" watchObservedRunningTime="2026-04-24 21:51:57.803733739 +0000 UTC m=+1531.787948520" Apr 24 21:52:00.652987 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:00.652939 
2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:52:00.652987 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:00.652992 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:52:00.654601 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:00.654567 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:52:10.653273 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:10.653221 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:52:20.654111 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:20.654062 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:52:30.653698 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:30.653638 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:52:39.506796 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:39.506758 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p"] Apr 24 21:52:39.507207 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:39.507135 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" podUID="7d32ca98-6b4b-442a-99ef-9c7fbda96444" containerName="storage-initializer" containerID="cri-o://8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5" gracePeriod=30 Apr 24 21:52:40.654171 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:40.654115 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:52:50.658268 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:52:50.658214 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:53:00.653217 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:00.653163 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 
21:53:09.705406 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.705380 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-inline-config-test-kserve-6f547d47b5-7c75p_7d32ca98-6b4b-442a-99ef-9c7fbda96444/storage-initializer/0.log" Apr 24 21:53:09.705790 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.705447 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:53:09.737879 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.737842 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-home\") pod \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " Apr 24 21:53:09.737879 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.737887 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-model-cache\") pod \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " Apr 24 21:53:09.738121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.737912 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tmp-dir\") pod \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " Apr 24 21:53:09.738121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.737943 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kserve-provision-location\") pod \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " Apr 24 
21:53:09.738121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738016 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-dshm\") pod \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " Apr 24 21:53:09.738121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738062 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzpl5\" (UniqueName: \"kubernetes.io/projected/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kube-api-access-fzpl5\") pod \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " Apr 24 21:53:09.738330 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738122 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tls-certs\") pod \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\" (UID: \"7d32ca98-6b4b-442a-99ef-9c7fbda96444\") " Apr 24 21:53:09.738330 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738158 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-home" (OuterVolumeSpecName: "home") pod "7d32ca98-6b4b-442a-99ef-9c7fbda96444" (UID: "7d32ca98-6b4b-442a-99ef-9c7fbda96444"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:53:09.738330 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738248 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "7d32ca98-6b4b-442a-99ef-9c7fbda96444" (UID: "7d32ca98-6b4b-442a-99ef-9c7fbda96444"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:53:09.738330 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738307 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-model-cache" (OuterVolumeSpecName: "model-cache") pod "7d32ca98-6b4b-442a-99ef-9c7fbda96444" (UID: "7d32ca98-6b4b-442a-99ef-9c7fbda96444"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:53:09.738530 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738462 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:53:09.738530 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738481 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:53:09.738530 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.738497 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:53:09.740439 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.740405 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-dshm" (OuterVolumeSpecName: "dshm") pod "7d32ca98-6b4b-442a-99ef-9c7fbda96444" (UID: "7d32ca98-6b4b-442a-99ef-9c7fbda96444"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:53:09.740581 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.740554 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kube-api-access-fzpl5" (OuterVolumeSpecName: "kube-api-access-fzpl5") pod "7d32ca98-6b4b-442a-99ef-9c7fbda96444" (UID: "7d32ca98-6b4b-442a-99ef-9c7fbda96444"). InnerVolumeSpecName "kube-api-access-fzpl5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:53:09.740833 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.740815 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7d32ca98-6b4b-442a-99ef-9c7fbda96444" (UID: "7d32ca98-6b4b-442a-99ef-9c7fbda96444"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:53:09.792394 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.792290 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7d32ca98-6b4b-442a-99ef-9c7fbda96444" (UID: "7d32ca98-6b4b-442a-99ef-9c7fbda96444"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:53:09.839450 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.839410 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:53:09.839450 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.839442 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fzpl5\" (UniqueName: \"kubernetes.io/projected/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kube-api-access-fzpl5\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:53:09.839616 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.839459 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d32ca98-6b4b-442a-99ef-9c7fbda96444-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:53:09.839616 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:09.839473 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d32ca98-6b4b-442a-99ef-9c7fbda96444-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:53:10.027119 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.027088 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-inline-config-test-kserve-6f547d47b5-7c75p_7d32ca98-6b4b-442a-99ef-9c7fbda96444/storage-initializer/0.log" Apr 24 21:53:10.027283 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.027137 2580 generic.go:358] "Generic (PLEG): container finished" podID="7d32ca98-6b4b-442a-99ef-9c7fbda96444" containerID="8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5" exitCode=137 Apr 24 21:53:10.027283 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.027187 2580 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" event={"ID":"7d32ca98-6b4b-442a-99ef-9c7fbda96444","Type":"ContainerDied","Data":"8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5"} Apr 24 21:53:10.027283 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.027218 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" event={"ID":"7d32ca98-6b4b-442a-99ef-9c7fbda96444","Type":"ContainerDied","Data":"64b55e7043b72f3c0676be53c2e353fd36486757824a1a4e26326ef68343b663"} Apr 24 21:53:10.027283 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.027218 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p" Apr 24 21:53:10.027283 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.027237 2580 scope.go:117] "RemoveContainer" containerID="8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5" Apr 24 21:53:10.046774 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.046752 2580 scope.go:117] "RemoveContainer" containerID="8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5" Apr 24 21:53:10.047092 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:53:10.047070 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5\": container with ID starting with 8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5 not found: ID does not exist" containerID="8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5" Apr 24 21:53:10.047170 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.047101 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5"} err="failed to get container 
status \"8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5\": rpc error: code = NotFound desc = could not find container \"8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5\": container with ID starting with 8218c2c38fb41df5261b943394dc0c2a0878ef1fc3daf60ebdc1453717c0fba5 not found: ID does not exist" Apr 24 21:53:10.063744 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.063703 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p"] Apr 24 21:53:10.068777 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.068740 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6f547d47b5-7c75p"] Apr 24 21:53:10.580473 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.580434 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d32ca98-6b4b-442a-99ef-9c7fbda96444" path="/var/lib/kubelet/pods/7d32ca98-6b4b-442a-99ef-9c7fbda96444/volumes" Apr 24 21:53:10.653536 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:10.653483 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:53:20.653703 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:20.653584 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:53:30.653752 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:30.653698 2580 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused" Apr 24 21:53:40.664168 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:40.664125 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:53:40.672775 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:53:40.672744 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 21:54:30.127061 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127022 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp"] Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127531 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="storage-initializer" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127552 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="storage-initializer" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127566 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127574 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127589 2580 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127598 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127618 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d32ca98-6b4b-442a-99ef-9c7fbda96444" containerName="storage-initializer" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127626 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d32ca98-6b4b-442a-99ef-9c7fbda96444" containerName="storage-initializer" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127635 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="storage-initializer" Apr 24 21:54:30.127644 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127643 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="storage-initializer" Apr 24 21:54:30.128174 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127685 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="llm-d-routing-sidecar" Apr 24 21:54:30.128174 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127694 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="llm-d-routing-sidecar" Apr 24 21:54:30.128174 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127782 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d32ca98-6b4b-442a-99ef-9c7fbda96444" containerName="storage-initializer" Apr 24 21:54:30.128174 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:54:30.127797 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="llm-d-routing-sidecar" Apr 24 21:54:30.128174 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127807 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="44da7e50-aeb4-4fbf-80a1-31f697b4b936" containerName="main" Apr 24 21:54:30.128174 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.127821 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e446da05-1992-4e35-873d-4b41352c46a6" containerName="main" Apr 24 21:54:30.131413 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.131389 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.135957 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.135935 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 24 21:54:30.136189 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.136175 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-8xhtd\"" Apr 24 21:54:30.148643 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.148614 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp"] Apr 24 21:54:30.184793 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.184753 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.185001 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.184809 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fa6aed3-5081-4963-aa75-22bc75d4246b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.185001 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.184834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.185001 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.184888 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z5zw\" (UniqueName: \"kubernetes.io/projected/9fa6aed3-5081-4963-aa75-22bc75d4246b-kube-api-access-6z5zw\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.185001 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.184937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" 
(UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.185001 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.184964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.285857 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.285818 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.285857 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.285857 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.286118 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.285949 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: 
\"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.286118 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.286010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fa6aed3-5081-4963-aa75-22bc75d4246b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.286118 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.286045 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.286118 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.286075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z5zw\" (UniqueName: \"kubernetes.io/projected/9fa6aed3-5081-4963-aa75-22bc75d4246b-kube-api-access-6z5zw\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.286387 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.286302 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: 
\"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.286387 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.286365 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.286467 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.286394 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.286467 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.286418 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.288935 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.288901 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fa6aed3-5081-4963-aa75-22bc75d4246b-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.298607 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.298571 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z5zw\" (UniqueName: \"kubernetes.io/projected/9fa6aed3-5081-4963-aa75-22bc75d4246b-kube-api-access-6z5zw\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.440936 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.440891 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:54:30.588936 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:30.588902 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp"] Apr 24 21:54:30.591755 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:54:30.591724 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fa6aed3_5081_4963_aa75_22bc75d4246b.slice/crio-c31e1c94b3e2f9b001f3c6ddcfe63656e011bc924c8121587d67e5cbbb488675 WatchSource:0}: Error finding container c31e1c94b3e2f9b001f3c6ddcfe63656e011bc924c8121587d67e5cbbb488675: Status 404 returned error can't find the container with id c31e1c94b3e2f9b001f3c6ddcfe63656e011bc924c8121587d67e5cbbb488675 Apr 24 21:54:31.298012 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:31.297970 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" event={"ID":"9fa6aed3-5081-4963-aa75-22bc75d4246b","Type":"ContainerStarted","Data":"bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689"} Apr 24 
21:54:31.298012 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:31.298013 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" event={"ID":"9fa6aed3-5081-4963-aa75-22bc75d4246b","Type":"ContainerStarted","Data":"c31e1c94b3e2f9b001f3c6ddcfe63656e011bc924c8121587d67e5cbbb488675"} Apr 24 21:54:32.302767 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:32.302728 2580 generic.go:358] "Generic (PLEG): container finished" podID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerID="bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689" exitCode=0 Apr 24 21:54:32.303142 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:32.302810 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" event={"ID":"9fa6aed3-5081-4963-aa75-22bc75d4246b","Type":"ContainerDied","Data":"bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689"} Apr 24 21:54:33.307484 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:54:33.307434 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" event={"ID":"9fa6aed3-5081-4963-aa75-22bc75d4246b","Type":"ContainerStarted","Data":"c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2"} Apr 24 21:55:04.442455 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:04.442413 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" event={"ID":"9fa6aed3-5081-4963-aa75-22bc75d4246b","Type":"ContainerStarted","Data":"ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1"} Apr 24 21:55:04.442986 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:04.442771 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:55:04.445297 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:04.445269 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:55:04.467810 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:04.467754 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podStartSLOduration=2.864780088 podStartE2EDuration="34.467737595s" podCreationTimestamp="2026-04-24 21:54:30 +0000 UTC" firstStartedPulling="2026-04-24 21:54:32.303874242 +0000 UTC m=+1686.288089002" lastFinishedPulling="2026-04-24 21:55:03.906831749 +0000 UTC m=+1717.891046509" observedRunningTime="2026-04-24 21:55:04.465715903 +0000 UTC m=+1718.449930683" watchObservedRunningTime="2026-04-24 21:55:04.467737595 +0000 UTC m=+1718.451952373" Apr 24 21:55:05.447130 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:05.447091 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:55:10.441324 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:10.441278 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:55:10.441842 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:10.441340 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:55:10.441842 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:10.441741 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.37:8082/healthz\": dial tcp 10.132.0.37:8082: connect: connection refused" Apr 24 21:55:10.443019 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:10.442994 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:55:20.443199 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:20.443149 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:55:20.443675 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:20.443570 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:55:20.444967 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:20.444915 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:55:20.445081 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:20.444946 2580 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" Apr 24 21:55:20.496379 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:20.496342 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:55:30.497006 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:30.496962 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:55:40.497048 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:40.497001 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:55:50.497415 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:55:50.497374 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:56:00.497375 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:00.497332 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy 
(responded with \"NOT_SERVING\")" Apr 24 21:56:10.496987 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:10.496945 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:56:11.977222 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:11.977140 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp"] Apr 24 21:56:11.977693 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:11.977499 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" containerID="cri-o://c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2" gracePeriod=30 Apr 24 21:56:11.977693 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:11.977564 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="tokenizer" containerID="cri-o://ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1" gracePeriod=30 Apr 24 21:56:11.979028 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:11.978982 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:56:12.665875 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:12.665840 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerID="c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2" exitCode=0
Apr 24 21:56:12.666050 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:12.665916 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" event={"ID":"9fa6aed3-5081-4963-aa75-22bc75d4246b","Type":"ContainerDied","Data":"c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2"}
Apr 24 21:56:13.328233 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.328206 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp"
Apr 24 21:56:13.442008 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.441917 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-kserve-provision-location\") pod \"9fa6aed3-5081-4963-aa75-22bc75d4246b\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") "
Apr 24 21:56:13.442008 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.441954 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-cache\") pod \"9fa6aed3-5081-4963-aa75-22bc75d4246b\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") "
Apr 24 21:56:13.442008 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.441996 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-uds\") pod \"9fa6aed3-5081-4963-aa75-22bc75d4246b\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") "
Apr 24 21:56:13.442300 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.442028 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z5zw\" (UniqueName: \"kubernetes.io/projected/9fa6aed3-5081-4963-aa75-22bc75d4246b-kube-api-access-6z5zw\") pod \"9fa6aed3-5081-4963-aa75-22bc75d4246b\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") "
Apr 24 21:56:13.442300 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.442055 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fa6aed3-5081-4963-aa75-22bc75d4246b-tls-certs\") pod \"9fa6aed3-5081-4963-aa75-22bc75d4246b\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") "
Apr 24 21:56:13.442300 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.442126 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-tmp\") pod \"9fa6aed3-5081-4963-aa75-22bc75d4246b\" (UID: \"9fa6aed3-5081-4963-aa75-22bc75d4246b\") "
Apr 24 21:56:13.442300 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.442269 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9fa6aed3-5081-4963-aa75-22bc75d4246b" (UID: "9fa6aed3-5081-4963-aa75-22bc75d4246b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:13.442464 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.442308 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9fa6aed3-5081-4963-aa75-22bc75d4246b" (UID: "9fa6aed3-5081-4963-aa75-22bc75d4246b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:13.442464 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.442360 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:56:13.442559 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.442532 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9fa6aed3-5081-4963-aa75-22bc75d4246b" (UID: "9fa6aed3-5081-4963-aa75-22bc75d4246b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:13.443033 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.442943 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9fa6aed3-5081-4963-aa75-22bc75d4246b" (UID: "9fa6aed3-5081-4963-aa75-22bc75d4246b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:13.444453 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.444429 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa6aed3-5081-4963-aa75-22bc75d4246b-kube-api-access-6z5zw" (OuterVolumeSpecName: "kube-api-access-6z5zw") pod "9fa6aed3-5081-4963-aa75-22bc75d4246b" (UID: "9fa6aed3-5081-4963-aa75-22bc75d4246b"). InnerVolumeSpecName "kube-api-access-6z5zw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:56:13.444544 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.444463 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fa6aed3-5081-4963-aa75-22bc75d4246b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9fa6aed3-5081-4963-aa75-22bc75d4246b" (UID: "9fa6aed3-5081-4963-aa75-22bc75d4246b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:56:13.543090 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.543039 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-tmp\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:56:13.543090 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.543087 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:56:13.543090 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.543099 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fa6aed3-5081-4963-aa75-22bc75d4246b-tokenizer-uds\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:56:13.543090 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.543108 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6z5zw\" (UniqueName: \"kubernetes.io/projected/9fa6aed3-5081-4963-aa75-22bc75d4246b-kube-api-access-6z5zw\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:56:13.543368 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.543119 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fa6aed3-5081-4963-aa75-22bc75d4246b-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 21:56:13.670681 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.670629 2580 generic.go:358] "Generic (PLEG): container finished" podID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerID="ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1" exitCode=0
Apr 24 21:56:13.670846 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.670699 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" event={"ID":"9fa6aed3-5081-4963-aa75-22bc75d4246b","Type":"ContainerDied","Data":"ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1"}
Apr 24 21:56:13.670846 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.670733 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp"
Apr 24 21:56:13.670846 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.670750 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp" event={"ID":"9fa6aed3-5081-4963-aa75-22bc75d4246b","Type":"ContainerDied","Data":"c31e1c94b3e2f9b001f3c6ddcfe63656e011bc924c8121587d67e5cbbb488675"}
Apr 24 21:56:13.670846 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.670765 2580 scope.go:117] "RemoveContainer" containerID="ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1"
Apr 24 21:56:13.679547 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.679527 2580 scope.go:117] "RemoveContainer" containerID="c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2"
Apr 24 21:56:13.687225 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.687208 2580 scope.go:117] "RemoveContainer" containerID="bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689"
Apr 24 21:56:13.694715 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.694643 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp"]
Apr 24 21:56:13.695687 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.695646 2580 scope.go:117] "RemoveContainer" containerID="ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1"
Apr 24 21:56:13.695934 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:56:13.695916 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1\": container with ID starting with ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1 not found: ID does not exist" containerID="ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1"
Apr 24 21:56:13.695989 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.695946 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1"} err="failed to get container status \"ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1\": rpc error: code = NotFound desc = could not find container \"ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1\": container with ID starting with ee5bb9d62dd1cc44edcb537221387d3ba2cfcbcb66af3f67fb8994c14ae78fe1 not found: ID does not exist"
Apr 24 21:56:13.695989 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.695965 2580 scope.go:117] "RemoveContainer" containerID="c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2"
Apr 24 21:56:13.696189 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:56:13.696170 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2\": container with ID starting with c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2 not found: ID does not exist" containerID="c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2"
Apr 24 21:56:13.696237 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.696193 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2"} err="failed to get container status \"c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2\": rpc error: code = NotFound desc = could not find container \"c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2\": container with ID starting with c0340ac59522978b690e74b60a161456f8413be9e197b67910443dc5a7da99c2 not found: ID does not exist"
Apr 24 21:56:13.696237 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.696206 2580 scope.go:117] "RemoveContainer" containerID="bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689"
Apr 24 21:56:13.696468 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:56:13.696448 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689\": container with ID starting with bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689 not found: ID does not exist" containerID="bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689"
Apr 24 21:56:13.696560 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.696472 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689"} err="failed to get container status \"bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689\": rpc error: code = NotFound desc = could not find container \"bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689\": container with ID starting with bf557fbf62dae2c6c169b9e933e141e5a93f868178d6cce412082b619a04c689 not found: ID does not exist"
Apr 24 21:56:13.698613 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:13.698590 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6897fdcmnthp"]
Apr 24 21:56:14.577533 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:14.577500 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" path="/var/lib/kubelet/pods/9fa6aed3-5081-4963-aa75-22bc75d4246b/volumes"
Apr 24 21:56:22.054286 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054251 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"]
Apr 24 21:56:22.054691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054560 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main"
Apr 24 21:56:22.054691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054571 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main"
Apr 24 21:56:22.054691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054584 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="tokenizer"
Apr 24 21:56:22.054691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054589 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="tokenizer"
Apr 24 21:56:22.054691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054601 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="storage-initializer"
Apr 24 21:56:22.054691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054607 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="storage-initializer"
Apr 24 21:56:22.054691 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054691 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="main"
Apr 24 21:56:22.054903 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.054700 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="9fa6aed3-5081-4963-aa75-22bc75d4246b" containerName="tokenizer"
Apr 24 21:56:22.057686 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.057649 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.060765 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.060737 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 24 21:56:22.076134 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.076100 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"]
Apr 24 21:56:22.115050 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.115017 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-model-cache\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.115050 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.115054 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-dshm\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.115281 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.115085 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/210d5fca-21c3-4080-9382-346506cee9c4-tls-certs\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.115281 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.115116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krvt7\" (UniqueName: \"kubernetes.io/projected/210d5fca-21c3-4080-9382-346506cee9c4-kube-api-access-krvt7\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.115281 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.115158 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-home\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.115281 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.115225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.115281 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.115277 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-tmp-dir\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.216643 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.216603 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/210d5fca-21c3-4080-9382-346506cee9c4-tls-certs\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.216643 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.216643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krvt7\" (UniqueName: \"kubernetes.io/projected/210d5fca-21c3-4080-9382-346506cee9c4-kube-api-access-krvt7\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.216915 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.216722 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-home\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.216915 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.216762 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.216915 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.216792 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-tmp-dir\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.216915 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.216825 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-model-cache\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.216915 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.216848 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-dshm\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.217237 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.217203 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.217350 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.217235 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-tmp-dir\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.217350 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.217300 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-home\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.217427 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.217371 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-model-cache\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.219315 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.219295 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-dshm\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.219685 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.219645 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/210d5fca-21c3-4080-9382-346506cee9c4-tls-certs\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.226974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.226943 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krvt7\" (UniqueName: \"kubernetes.io/projected/210d5fca-21c3-4080-9382-346506cee9c4-kube-api-access-krvt7\") pod \"precise-prefix-cache-test-kserve-7c59dbc947-5wt6t\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.320191 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.320104 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"]
Apr 24 21:56:22.329850 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.329808 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.332366 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.332339 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-z7wg4\""
Apr 24 21:56:22.336436 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.336409 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"]
Apr 24 21:56:22.369149 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.369110 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"
Apr 24 21:56:22.418447 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.418398 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.418568 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.418494 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33db2c1b-2825-4704-926e-6951ba058c4c-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.418568 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.418533 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ttmz\" (UniqueName: \"kubernetes.io/projected/33db2c1b-2825-4704-926e-6951ba058c4c-kube-api-access-9ttmz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.418700 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.418586 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.418700 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.418622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.418798 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.418719 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.500191 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.500159 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"]
Apr 24 21:56:22.502808 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:56:22.502772 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod210d5fca_21c3_4080_9382_346506cee9c4.slice/crio-5f8de73afdb6081b9d1cc4053aec6940f84d4ccc27c55241be38f44f5532935f WatchSource:0}: Error finding container 5f8de73afdb6081b9d1cc4053aec6940f84d4ccc27c55241be38f44f5532935f: Status 404 returned error can't find the container with id 5f8de73afdb6081b9d1cc4053aec6940f84d4ccc27c55241be38f44f5532935f
Apr 24 21:56:22.504528 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.504510 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:56:22.519186 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519164 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519299 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519215 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33db2c1b-2825-4704-926e-6951ba058c4c-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519359 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519339 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ttmz\" (UniqueName: \"kubernetes.io/projected/33db2c1b-2825-4704-926e-6951ba058c4c-kube-api-access-9ttmz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519460 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519392 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519619 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519577 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519712 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519619 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519712 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519587 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519809 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519738 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.519867 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.519847 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.522573 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.522551 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33db2c1b-2825-4704-926e-6951ba058c4c-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.528018 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.527996 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ttmz\" (UniqueName: \"kubernetes.io/projected/33db2c1b-2825-4704-926e-6951ba058c4c-kube-api-access-9ttmz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.643175 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.643070 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"
Apr 24 21:56:22.707103 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.707063 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" event={"ID":"210d5fca-21c3-4080-9382-346506cee9c4","Type":"ContainerStarted","Data":"edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7"}
Apr 24 21:56:22.707103 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.707110 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" event={"ID":"210d5fca-21c3-4080-9382-346506cee9c4","Type":"ContainerStarted","Data":"5f8de73afdb6081b9d1cc4053aec6940f84d4ccc27c55241be38f44f5532935f"}
Apr 24 21:56:22.789801 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:22.789749 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"]
Apr 24 21:56:22.792896 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:56:22.792853 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33db2c1b_2825_4704_926e_6951ba058c4c.slice/crio-564faef5d47cf7293a08dbfe4ec842c159a7b39a034f59150a943cb8aacc76f4 WatchSource:0}: Error finding container 564faef5d47cf7293a08dbfe4ec842c159a7b39a034f59150a943cb8aacc76f4: Status 404 returned error can't find the container with id 564faef5d47cf7293a08dbfe4ec842c159a7b39a034f59150a943cb8aacc76f4
Apr 24 21:56:23.713960 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:23.713920 2580 generic.go:358] "Generic (PLEG): container finished" podID="33db2c1b-2825-4704-926e-6951ba058c4c" containerID="f3d1a8dccca9b30175d584d4cb718d8511f3ec4b1c633df4c5d6ddc26ab77527" exitCode=0
Apr 24 21:56:23.714435 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:23.714008 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerDied","Data":"f3d1a8dccca9b30175d584d4cb718d8511f3ec4b1c633df4c5d6ddc26ab77527"}
Apr 24 21:56:23.714435 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:23.714050 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerStarted","Data":"564faef5d47cf7293a08dbfe4ec842c159a7b39a034f59150a943cb8aacc76f4"}
Apr 24 21:56:24.721580 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:24.721542 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerStarted","Data":"8de465236bdcc9464ead7a1c3a99cd2f4454bd71ae3f9c6c1db809c709f50391"}
Apr 24 21:56:24.721580 ip-10-0-131-237
kubenswrapper[2580]: I0424 21:56:24.721584 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerStarted","Data":"aeb84c05662470634e21157428aaacf57dd4affbeb9381bb754c60ebcc962dc6"} Apr 24 21:56:24.722131 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:24.721704 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:56:24.748522 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:24.748428 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" podStartSLOduration=2.748407951 podStartE2EDuration="2.748407951s" podCreationTimestamp="2026-04-24 21:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:24.744365857 +0000 UTC m=+1798.728580637" watchObservedRunningTime="2026-04-24 21:56:24.748407951 +0000 UTC m=+1798.732622732" Apr 24 21:56:26.584165 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:26.584134 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:56:26.588245 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:26.588222 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 21:56:26.730507 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:26.730464 2580 generic.go:358] "Generic (PLEG): container finished" podID="210d5fca-21c3-4080-9382-346506cee9c4" 
containerID="edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7" exitCode=0 Apr 24 21:56:26.730688 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:26.730534 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" event={"ID":"210d5fca-21c3-4080-9382-346506cee9c4","Type":"ContainerDied","Data":"edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7"} Apr 24 21:56:28.740304 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:28.740265 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" event={"ID":"210d5fca-21c3-4080-9382-346506cee9c4","Type":"ContainerStarted","Data":"7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce"} Apr 24 21:56:28.776350 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:28.776297 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" podStartSLOduration=5.698805813 podStartE2EDuration="6.776278803s" podCreationTimestamp="2026-04-24 21:56:22 +0000 UTC" firstStartedPulling="2026-04-24 21:56:26.731786097 +0000 UTC m=+1800.716000854" lastFinishedPulling="2026-04-24 21:56:27.809259085 +0000 UTC m=+1801.793473844" observedRunningTime="2026-04-24 21:56:28.775939764 +0000 UTC m=+1802.760154544" watchObservedRunningTime="2026-04-24 21:56:28.776278803 +0000 UTC m=+1802.760493582" Apr 24 21:56:32.369600 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:32.369562 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" Apr 24 21:56:32.370254 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:32.369651 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" Apr 24 21:56:32.382408 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:56:32.382376 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" Apr 24 21:56:32.643799 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:32.643704 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:56:32.643799 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:32.643753 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:56:32.644883 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:56:32.644855 2580 logging.go:55] [core] [Channel #74 SubChannel #75]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.39:9003", ServerName: "10.132.0.39:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.39:9003: connect: connection refused" Apr 24 21:56:32.646201 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:32.646177 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:56:32.755498 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:32.755465 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:56:32.765932 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:32.765900 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" Apr 24 21:56:33.643941 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:33.643891 2580 prober.go:120] "Probe failed" probeType="Liveness" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.39:9003\" within 1s: context deadline exceeded" Apr 24 21:56:34.764411 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:34.764383 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5_33db2c1b-2825-4704-926e-6951ba058c4c/main/0.log" Apr 24 21:56:34.764844 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:34.764757 2580 generic.go:358] "Generic (PLEG): container finished" podID="33db2c1b-2825-4704-926e-6951ba058c4c" containerID="aeb84c05662470634e21157428aaacf57dd4affbeb9381bb754c60ebcc962dc6" exitCode=1 Apr 24 21:56:34.764891 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:34.764844 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerDied","Data":"aeb84c05662470634e21157428aaacf57dd4affbeb9381bb754c60ebcc962dc6"} Apr 24 21:56:34.765412 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:34.765388 2580 scope.go:117] "RemoveContainer" containerID="aeb84c05662470634e21157428aaacf57dd4affbeb9381bb754c60ebcc962dc6" Apr 24 21:56:35.770404 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:35.770372 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5_33db2c1b-2825-4704-926e-6951ba058c4c/main/0.log" Apr 24 21:56:35.770928 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:35.770804 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" 
event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerStarted","Data":"49d9de0bb724024de1618357a73f158a3b1bb172190bdb57fa829cc9641aabf7"} Apr 24 21:56:35.771121 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:35.771088 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:56:42.644392 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:56:42.644351 2580 logging.go:55] [core] [Channel #76 SubChannel #77]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.39:9003", ServerName: "10.132.0.39:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.39:9003: connect: connection refused" Apr 24 21:56:43.644045 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:56:43.643991 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.39:9003\" within 1s: context deadline exceeded" Apr 24 21:57:06.775942 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:06.775909 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:57:08.313904 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.313866 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"] Apr 24 21:57:08.314297 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.314160 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" podUID="210d5fca-21c3-4080-9382-346506cee9c4" containerName="main" 
containerID="cri-o://7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce" gracePeriod=30 Apr 24 21:57:08.324152 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.324120 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"] Apr 24 21:57:08.324497 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.324440 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="tokenizer" containerID="cri-o://8de465236bdcc9464ead7a1c3a99cd2f4454bd71ae3f9c6c1db809c709f50391" gracePeriod=30 Apr 24 21:57:08.324599 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.324470 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="main" containerID="cri-o://49d9de0bb724024de1618357a73f158a3b1bb172190bdb57fa829cc9641aabf7" gracePeriod=30 Apr 24 21:57:08.568793 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.568633 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" Apr 24 21:57:08.632442 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.632406 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-tmp-dir\") pod \"210d5fca-21c3-4080-9382-346506cee9c4\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " Apr 24 21:57:08.632609 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.632477 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-dshm\") pod \"210d5fca-21c3-4080-9382-346506cee9c4\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " Apr 24 21:57:08.632609 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.632536 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krvt7\" (UniqueName: \"kubernetes.io/projected/210d5fca-21c3-4080-9382-346506cee9c4-kube-api-access-krvt7\") pod \"210d5fca-21c3-4080-9382-346506cee9c4\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " Apr 24 21:57:08.632609 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.632575 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-model-cache\") pod \"210d5fca-21c3-4080-9382-346506cee9c4\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " Apr 24 21:57:08.632798 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.632611 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-home\") pod \"210d5fca-21c3-4080-9382-346506cee9c4\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " Apr 24 21:57:08.632798 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:57:08.632640 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-kserve-provision-location\") pod \"210d5fca-21c3-4080-9382-346506cee9c4\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " Apr 24 21:57:08.632798 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.632715 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/210d5fca-21c3-4080-9382-346506cee9c4-tls-certs\") pod \"210d5fca-21c3-4080-9382-346506cee9c4\" (UID: \"210d5fca-21c3-4080-9382-346506cee9c4\") " Apr 24 21:57:08.633008 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.632705 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "210d5fca-21c3-4080-9382-346506cee9c4" (UID: "210d5fca-21c3-4080-9382-346506cee9c4"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.633191 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.633163 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-model-cache" (OuterVolumeSpecName: "model-cache") pod "210d5fca-21c3-4080-9382-346506cee9c4" (UID: "210d5fca-21c3-4080-9382-346506cee9c4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.633577 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.633535 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-home" (OuterVolumeSpecName: "home") pod "210d5fca-21c3-4080-9382-346506cee9c4" (UID: "210d5fca-21c3-4080-9382-346506cee9c4"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.635050 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.635020 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-dshm" (OuterVolumeSpecName: "dshm") pod "210d5fca-21c3-4080-9382-346506cee9c4" (UID: "210d5fca-21c3-4080-9382-346506cee9c4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.635394 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.635350 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d5fca-21c3-4080-9382-346506cee9c4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "210d5fca-21c3-4080-9382-346506cee9c4" (UID: "210d5fca-21c3-4080-9382-346506cee9c4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:57:08.635547 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.635528 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d5fca-21c3-4080-9382-346506cee9c4-kube-api-access-krvt7" (OuterVolumeSpecName: "kube-api-access-krvt7") pod "210d5fca-21c3-4080-9382-346506cee9c4" (UID: "210d5fca-21c3-4080-9382-346506cee9c4"). InnerVolumeSpecName "kube-api-access-krvt7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:57:08.700821 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.700765 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "210d5fca-21c3-4080-9382-346506cee9c4" (UID: "210d5fca-21c3-4080-9382-346506cee9c4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:08.734418 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.734379 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/210d5fca-21c3-4080-9382-346506cee9c4-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.734418 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.734409 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.734418 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.734417 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.734418 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.734427 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-krvt7\" (UniqueName: \"kubernetes.io/projected/210d5fca-21c3-4080-9382-346506cee9c4-kube-api-access-krvt7\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.734778 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.734436 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.734778 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.734445 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.734778 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.734466 2580 reconciler_common.go:299] "Volume detached 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/210d5fca-21c3-4080-9382-346506cee9c4-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:08.883624 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.883530 2580 generic.go:358] "Generic (PLEG): container finished" podID="210d5fca-21c3-4080-9382-346506cee9c4" containerID="7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce" exitCode=0 Apr 24 21:57:08.883624 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.883583 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" event={"ID":"210d5fca-21c3-4080-9382-346506cee9c4","Type":"ContainerDied","Data":"7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce"} Apr 24 21:57:08.883624 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.883603 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" Apr 24 21:57:08.883900 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.883631 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t" event={"ID":"210d5fca-21c3-4080-9382-346506cee9c4","Type":"ContainerDied","Data":"5f8de73afdb6081b9d1cc4053aec6940f84d4ccc27c55241be38f44f5532935f"} Apr 24 21:57:08.883900 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.883652 2580 scope.go:117] "RemoveContainer" containerID="7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce" Apr 24 21:57:08.886314 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.886291 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5_33db2c1b-2825-4704-926e-6951ba058c4c/main/0.log" Apr 24 21:57:08.888676 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.888625 
2580 generic.go:358] "Generic (PLEG): container finished" podID="33db2c1b-2825-4704-926e-6951ba058c4c" containerID="49d9de0bb724024de1618357a73f158a3b1bb172190bdb57fa829cc9641aabf7" exitCode=0 Apr 24 21:57:08.888809 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.888735 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerDied","Data":"49d9de0bb724024de1618357a73f158a3b1bb172190bdb57fa829cc9641aabf7"} Apr 24 21:57:08.895015 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.894998 2580 scope.go:117] "RemoveContainer" containerID="edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7" Apr 24 21:57:08.905987 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.905967 2580 scope.go:117] "RemoveContainer" containerID="7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce" Apr 24 21:57:08.906291 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:57:08.906273 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce\": container with ID starting with 7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce not found: ID does not exist" containerID="7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce" Apr 24 21:57:08.906361 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.906299 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce"} err="failed to get container status \"7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce\": rpc error: code = NotFound desc = could not find container \"7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce\": container with ID starting with 
7fe39e455ec9070e63038bb3ba5650437fab7c7fc2fbf46457489343d054ffce not found: ID does not exist" Apr 24 21:57:08.906361 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.906316 2580 scope.go:117] "RemoveContainer" containerID="edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7" Apr 24 21:57:08.906521 ip-10-0-131-237 kubenswrapper[2580]: E0424 21:57:08.906503 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7\": container with ID starting with edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7 not found: ID does not exist" containerID="edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7" Apr 24 21:57:08.906559 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.906525 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7"} err="failed to get container status \"edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7\": rpc error: code = NotFound desc = could not find container \"edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7\": container with ID starting with edcd5afb2be23624b97a66d93cc0e62332ec2bd5ce93178ca5b14f37f00a6de7 not found: ID does not exist" Apr 24 21:57:08.906559 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.906538 2580 scope.go:117] "RemoveContainer" containerID="aeb84c05662470634e21157428aaacf57dd4affbeb9381bb754c60ebcc962dc6" Apr 24 21:57:08.910499 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.910471 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"] Apr 24 21:57:08.914526 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:08.914502 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7c59dbc947-5wt6t"] Apr 24 21:57:09.896037 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:09.896002 2580 generic.go:358] "Generic (PLEG): container finished" podID="33db2c1b-2825-4704-926e-6951ba058c4c" containerID="8de465236bdcc9464ead7a1c3a99cd2f4454bd71ae3f9c6c1db809c709f50391" exitCode=0 Apr 24 21:57:09.896419 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:09.896081 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerDied","Data":"8de465236bdcc9464ead7a1c3a99cd2f4454bd71ae3f9c6c1db809c709f50391"} Apr 24 21:57:09.971055 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:09.971032 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:57:10.048856 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.048766 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ttmz\" (UniqueName: \"kubernetes.io/projected/33db2c1b-2825-4704-926e-6951ba058c4c-kube-api-access-9ttmz\") pod \"33db2c1b-2825-4704-926e-6951ba058c4c\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " Apr 24 21:57:10.048856 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.048810 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-tmp\") pod \"33db2c1b-2825-4704-926e-6951ba058c4c\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " Apr 24 21:57:10.048856 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.048854 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/33db2c1b-2825-4704-926e-6951ba058c4c-tls-certs\") pod \"33db2c1b-2825-4704-926e-6951ba058c4c\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " Apr 24 21:57:10.049154 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.048882 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-kserve-provision-location\") pod \"33db2c1b-2825-4704-926e-6951ba058c4c\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " Apr 24 21:57:10.049154 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.048918 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-uds\") pod \"33db2c1b-2825-4704-926e-6951ba058c4c\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " Apr 24 21:57:10.049154 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.048937 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-cache\") pod \"33db2c1b-2825-4704-926e-6951ba058c4c\" (UID: \"33db2c1b-2825-4704-926e-6951ba058c4c\") " Apr 24 21:57:10.049315 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.049241 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "33db2c1b-2825-4704-926e-6951ba058c4c" (UID: "33db2c1b-2825-4704-926e-6951ba058c4c"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:10.049315 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.049273 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "33db2c1b-2825-4704-926e-6951ba058c4c" (UID: "33db2c1b-2825-4704-926e-6951ba058c4c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:10.049430 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.049404 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "33db2c1b-2825-4704-926e-6951ba058c4c" (UID: "33db2c1b-2825-4704-926e-6951ba058c4c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:10.049758 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.049733 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "33db2c1b-2825-4704-926e-6951ba058c4c" (UID: "33db2c1b-2825-4704-926e-6951ba058c4c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:10.051248 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.051229 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33db2c1b-2825-4704-926e-6951ba058c4c-kube-api-access-9ttmz" (OuterVolumeSpecName: "kube-api-access-9ttmz") pod "33db2c1b-2825-4704-926e-6951ba058c4c" (UID: "33db2c1b-2825-4704-926e-6951ba058c4c"). InnerVolumeSpecName "kube-api-access-9ttmz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:57:10.051309 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.051262 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33db2c1b-2825-4704-926e-6951ba058c4c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "33db2c1b-2825-4704-926e-6951ba058c4c" (UID: "33db2c1b-2825-4704-926e-6951ba058c4c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:57:10.150187 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.150153 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9ttmz\" (UniqueName: \"kubernetes.io/projected/33db2c1b-2825-4704-926e-6951ba058c4c-kube-api-access-9ttmz\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:10.150187 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.150185 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-tmp\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:10.150187 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.150198 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/33db2c1b-2825-4704-926e-6951ba058c4c-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:10.150432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.150206 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:10.150432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.150217 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-uds\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:10.150432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.150228 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/33db2c1b-2825-4704-926e-6951ba058c4c-tokenizer-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 21:57:10.578036 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.578002 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d5fca-21c3-4080-9382-346506cee9c4" path="/var/lib/kubelet/pods/210d5fca-21c3-4080-9382-346506cee9c4/volumes" Apr 24 21:57:10.901462 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.901437 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" Apr 24 21:57:10.901884 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.901435 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5" event={"ID":"33db2c1b-2825-4704-926e-6951ba058c4c","Type":"ContainerDied","Data":"564faef5d47cf7293a08dbfe4ec842c159a7b39a034f59150a943cb8aacc76f4"} Apr 24 21:57:10.901884 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.901561 2580 scope.go:117] "RemoveContainer" containerID="49d9de0bb724024de1618357a73f158a3b1bb172190bdb57fa829cc9641aabf7" Apr 24 21:57:10.910352 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.910332 2580 scope.go:117] "RemoveContainer" containerID="8de465236bdcc9464ead7a1c3a99cd2f4454bd71ae3f9c6c1db809c709f50391" Apr 24 21:57:10.918146 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.918124 2580 scope.go:117] "RemoveContainer" containerID="f3d1a8dccca9b30175d584d4cb718d8511f3ec4b1c633df4c5d6ddc26ab77527" Apr 24 21:57:10.921190 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:57:10.921166 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"] Apr 24 21:57:10.928302 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:10.928279 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6958b8b568dt5"] Apr 24 21:57:12.579493 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:12.579442 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" path="/var/lib/kubelet/pods/33db2c1b-2825-4704-926e-6951ba058c4c/volumes" Apr 24 21:57:16.498394 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498358 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p"] Apr 24 21:57:16.498872 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498850 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="210d5fca-21c3-4080-9382-346506cee9c4" containerName="storage-initializer" Apr 24 21:57:16.498934 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498878 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="210d5fca-21c3-4080-9382-346506cee9c4" containerName="storage-initializer" Apr 24 21:57:16.498934 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498892 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="tokenizer" Apr 24 21:57:16.498934 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498915 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="tokenizer" Apr 24 21:57:16.498934 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498928 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" 
containerName="main" Apr 24 21:57:16.499099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498938 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="main" Apr 24 21:57:16.499099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498957 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="main" Apr 24 21:57:16.499099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498965 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="main" Apr 24 21:57:16.499099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498974 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="210d5fca-21c3-4080-9382-346506cee9c4" containerName="main" Apr 24 21:57:16.499099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498981 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="210d5fca-21c3-4080-9382-346506cee9c4" containerName="main" Apr 24 21:57:16.499099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.498992 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="storage-initializer" Apr 24 21:57:16.499099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.499000 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="storage-initializer" Apr 24 21:57:16.499099 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.499100 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="tokenizer" Apr 24 21:57:16.499387 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.499111 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="main" Apr 24 21:57:16.499387 ip-10-0-131-237 
kubenswrapper[2580]: I0424 21:57:16.499125 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="210d5fca-21c3-4080-9382-346506cee9c4" containerName="main" Apr 24 21:57:16.499387 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.499289 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="33db2c1b-2825-4704-926e-6951ba058c4c" containerName="main" Apr 24 21:57:16.504349 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.504321 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.506890 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.506869 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 21:57:16.515560 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.515533 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p"] Apr 24 21:57:16.609255 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.609217 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79d743c6-d0ed-4314-971f-3549b9056af0-tls-certs\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.609451 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.609268 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2b8\" (UniqueName: \"kubernetes.io/projected/79d743c6-d0ed-4314-971f-3549b9056af0-kube-api-access-2z2b8\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.609451 
ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.609338 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-tmp-dir\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.609451 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.609372 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-kserve-provision-location\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.609451 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.609393 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-model-cache\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.609451 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.609424 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-home\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.609451 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.609448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-dshm\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.710791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.710755 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-tmp-dir\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.710791 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.710804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-kserve-provision-location\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.711058 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.710824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-model-cache\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.711058 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.710850 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-home\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 
24 21:57:16.711058 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.710865 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-dshm\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.711058 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.710911 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79d743c6-d0ed-4314-971f-3549b9056af0-tls-certs\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.711058 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.710942 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2b8\" (UniqueName: \"kubernetes.io/projected/79d743c6-d0ed-4314-971f-3549b9056af0-kube-api-access-2z2b8\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.711529 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.711398 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-tmp-dir\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.711529 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.711450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-kserve-provision-location\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.711529 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.711493 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-model-cache\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.711772 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.711545 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-home\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.713570 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.713536 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-dshm\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.713836 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.713816 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79d743c6-d0ed-4314-971f-3549b9056af0-tls-certs\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.719027 ip-10-0-131-237 kubenswrapper[2580]: I0424 
21:57:16.719001 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2b8\" (UniqueName: \"kubernetes.io/projected/79d743c6-d0ed-4314-971f-3549b9056af0-kube-api-access-2z2b8\") pod \"stop-feature-test-kserve-6f89b754f-kqf9p\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.817956 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.817855 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:57:16.954924 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:16.954888 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p"] Apr 24 21:57:16.958193 ip-10-0-131-237 kubenswrapper[2580]: W0424 21:57:16.958155 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d743c6_d0ed_4314_971f_3549b9056af0.slice/crio-c4f618336c32207409387b8827060182c30e559ab5e86a7d4aff8ea0b9532b2e WatchSource:0}: Error finding container c4f618336c32207409387b8827060182c30e559ab5e86a7d4aff8ea0b9532b2e: Status 404 returned error can't find the container with id c4f618336c32207409387b8827060182c30e559ab5e86a7d4aff8ea0b9532b2e Apr 24 21:57:17.930423 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:17.930384 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" event={"ID":"79d743c6-d0ed-4314-971f-3549b9056af0","Type":"ContainerStarted","Data":"cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265"} Apr 24 21:57:17.930423 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:57:17.930428 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" 
event={"ID":"79d743c6-d0ed-4314-971f-3549b9056af0","Type":"ContainerStarted","Data":"c4f618336c32207409387b8827060182c30e559ab5e86a7d4aff8ea0b9532b2e"} Apr 24 21:59:11.327432 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:11.327342 2580 generic.go:358] "Generic (PLEG): container finished" podID="79d743c6-d0ed-4314-971f-3549b9056af0" containerID="cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265" exitCode=0 Apr 24 21:59:11.327974 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:11.327421 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" event={"ID":"79d743c6-d0ed-4314-971f-3549b9056af0","Type":"ContainerDied","Data":"cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265"} Apr 24 21:59:12.333004 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:12.332961 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" event={"ID":"79d743c6-d0ed-4314-971f-3549b9056af0","Type":"ContainerStarted","Data":"826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b"} Apr 24 21:59:12.358189 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:12.358137 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podStartSLOduration=116.358124539 podStartE2EDuration="1m56.358124539s" podCreationTimestamp="2026-04-24 21:57:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:59:12.357006989 +0000 UTC m=+1966.341221778" watchObservedRunningTime="2026-04-24 21:59:12.358124539 +0000 UTC m=+1966.342339317" Apr 24 21:59:16.818956 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:16.818912 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:59:16.818956 
ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:16.818962 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 21:59:16.820829 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:16.820795 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 21:59:26.819298 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:26.819251 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 21:59:36.819068 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:36.819021 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 21:59:46.818737 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:46.818684 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 21:59:56.818470 ip-10-0-131-237 kubenswrapper[2580]: I0424 21:59:56.818413 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" 
podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 22:00:06.819191 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:06.819140 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 22:00:16.818740 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:16.818693 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 22:00:26.818773 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:26.818721 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 22:00:36.819189 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:36.819129 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 22:00:46.818648 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:46.818552 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" 
podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 24 22:00:56.829000 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:56.828964 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 22:00:56.837057 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:56.837031 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 22:00:57.956884 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:57.956850 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p"] Apr 24 22:00:58.701790 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:00:58.701751 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" containerID="cri-o://826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b" gracePeriod=30 Apr 24 22:01:13.597123 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.597089 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv"] Apr 24 22:01:13.600743 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.600724 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.613275 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.613241 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv"] Apr 24 22:01:13.731941 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.731907 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3142eb2a-18c7-4142-a559-62b52c768f20-tls-certs\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.731941 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.731944 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvjsn\" (UniqueName: \"kubernetes.io/projected/3142eb2a-18c7-4142-a559-62b52c768f20-kube-api-access-kvjsn\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.732175 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.731977 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-model-cache\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.732175 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.732046 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-dshm\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: 
\"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.732175 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.732096 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-home\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.732175 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.732125 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-kserve-provision-location\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.732311 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.732200 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-tmp-dir\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.833687 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.833624 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3142eb2a-18c7-4142-a559-62b52c768f20-tls-certs\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.833902 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.833710 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvjsn\" (UniqueName: \"kubernetes.io/projected/3142eb2a-18c7-4142-a559-62b52c768f20-kube-api-access-kvjsn\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.833902 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.833746 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-model-cache\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.833902 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.833788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-dshm\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.834058 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.833905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-home\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.834114 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.834061 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-kserve-provision-location\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: 
\"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.834171 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.834146 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-tmp-dir\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.834262 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.834243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-model-cache\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.834359 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.834337 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-home\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.834432 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.834415 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-tmp-dir\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.834486 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.834418 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-kserve-provision-location\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.836298 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.836275 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-dshm\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.836473 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.836455 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3142eb2a-18c7-4142-a559-62b52c768f20-tls-certs\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.852693 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.852616 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvjsn\" (UniqueName: \"kubernetes.io/projected/3142eb2a-18c7-4142-a559-62b52c768f20-kube-api-access-kvjsn\") pod \"stop-feature-test-kserve-6f89b754f-6qfzv\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:13.910908 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:13.910874 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:01:14.043870 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:14.043844 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv"] Apr 24 22:01:14.047875 ip-10-0-131-237 kubenswrapper[2580]: W0424 22:01:14.047834 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3142eb2a_18c7_4142_a559_62b52c768f20.slice/crio-7d79d096fa166e4aed0c69163710847034b034b3c907f9ea07873c4b259e28ab WatchSource:0}: Error finding container 7d79d096fa166e4aed0c69163710847034b034b3c907f9ea07873c4b259e28ab: Status 404 returned error can't find the container with id 7d79d096fa166e4aed0c69163710847034b034b3c907f9ea07873c4b259e28ab Apr 24 22:01:14.755636 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:14.755596 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" event={"ID":"3142eb2a-18c7-4142-a559-62b52c768f20","Type":"ContainerStarted","Data":"335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e"} Apr 24 22:01:14.755636 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:14.755637 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" event={"ID":"3142eb2a-18c7-4142-a559-62b52c768f20","Type":"ContainerStarted","Data":"7d79d096fa166e4aed0c69163710847034b034b3c907f9ea07873c4b259e28ab"} Apr 24 22:01:26.607343 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:26.607310 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 22:01:26.612760 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:26.612731 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 22:01:28.969623 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:28.969598 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6f89b754f-kqf9p_79d743c6-d0ed-4314-971f-3549b9056af0/main/0.log" Apr 24 22:01:28.970066 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:28.970050 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 22:01:29.078670 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.078615 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z2b8\" (UniqueName: \"kubernetes.io/projected/79d743c6-d0ed-4314-971f-3549b9056af0-kube-api-access-2z2b8\") pod \"79d743c6-d0ed-4314-971f-3549b9056af0\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " Apr 24 22:01:29.078939 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.078697 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-home\") pod \"79d743c6-d0ed-4314-971f-3549b9056af0\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " Apr 24 22:01:29.078939 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.078744 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-kserve-provision-location\") pod \"79d743c6-d0ed-4314-971f-3549b9056af0\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " Apr 24 22:01:29.078939 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.078759 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-model-cache\") pod \"79d743c6-d0ed-4314-971f-3549b9056af0\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " Apr 24 22:01:29.078939 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.078789 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-tmp-dir\") pod \"79d743c6-d0ed-4314-971f-3549b9056af0\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " Apr 24 22:01:29.078939 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.078826 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79d743c6-d0ed-4314-971f-3549b9056af0-tls-certs\") pod \"79d743c6-d0ed-4314-971f-3549b9056af0\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " Apr 24 22:01:29.079261 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.078948 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-dshm\") pod \"79d743c6-d0ed-4314-971f-3549b9056af0\" (UID: \"79d743c6-d0ed-4314-971f-3549b9056af0\") " Apr 24 22:01:29.079261 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.079050 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-model-cache" (OuterVolumeSpecName: "model-cache") pod "79d743c6-d0ed-4314-971f-3549b9056af0" (UID: "79d743c6-d0ed-4314-971f-3549b9056af0"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:29.079397 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.079279 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:01:29.079637 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.079588 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-home" (OuterVolumeSpecName: "home") pod "79d743c6-d0ed-4314-971f-3549b9056af0" (UID: "79d743c6-d0ed-4314-971f-3549b9056af0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:29.081254 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.081176 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d743c6-d0ed-4314-971f-3549b9056af0-kube-api-access-2z2b8" (OuterVolumeSpecName: "kube-api-access-2z2b8") pod "79d743c6-d0ed-4314-971f-3549b9056af0" (UID: "79d743c6-d0ed-4314-971f-3549b9056af0"). InnerVolumeSpecName "kube-api-access-2z2b8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:01:29.081366 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.081290 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-dshm" (OuterVolumeSpecName: "dshm") pod "79d743c6-d0ed-4314-971f-3549b9056af0" (UID: "79d743c6-d0ed-4314-971f-3549b9056af0"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:29.081439 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.081428 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79d743c6-d0ed-4314-971f-3549b9056af0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "79d743c6-d0ed-4314-971f-3549b9056af0" (UID: "79d743c6-d0ed-4314-971f-3549b9056af0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:01:29.092428 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.092386 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "79d743c6-d0ed-4314-971f-3549b9056af0" (UID: "79d743c6-d0ed-4314-971f-3549b9056af0"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:29.138702 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.138635 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "79d743c6-d0ed-4314-971f-3549b9056af0" (UID: "79d743c6-d0ed-4314-971f-3549b9056af0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:01:29.180545 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.180509 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2z2b8\" (UniqueName: \"kubernetes.io/projected/79d743c6-d0ed-4314-971f-3549b9056af0-kube-api-access-2z2b8\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:01:29.180545 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.180538 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:01:29.180545 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.180549 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:01:29.180811 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.180560 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:01:29.180811 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.180570 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/79d743c6-d0ed-4314-971f-3549b9056af0-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:01:29.180811 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.180578 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/79d743c6-d0ed-4314-971f-3549b9056af0-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:01:29.807351 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.807325 2580 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6f89b754f-kqf9p_79d743c6-d0ed-4314-971f-3549b9056af0/main/0.log" Apr 24 22:01:29.807723 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.807700 2580 generic.go:358] "Generic (PLEG): container finished" podID="79d743c6-d0ed-4314-971f-3549b9056af0" containerID="826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b" exitCode=137 Apr 24 22:01:29.807823 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.807775 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" Apr 24 22:01:29.807823 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.807781 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" event={"ID":"79d743c6-d0ed-4314-971f-3549b9056af0","Type":"ContainerDied","Data":"826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b"} Apr 24 22:01:29.807823 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.807822 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p" event={"ID":"79d743c6-d0ed-4314-971f-3549b9056af0","Type":"ContainerDied","Data":"c4f618336c32207409387b8827060182c30e559ab5e86a7d4aff8ea0b9532b2e"} Apr 24 22:01:29.807936 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.807837 2580 scope.go:117] "RemoveContainer" containerID="826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b" Apr 24 22:01:29.817153 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.817132 2580 scope.go:117] "RemoveContainer" containerID="cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265" Apr 24 22:01:29.826880 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.826859 2580 scope.go:117] "RemoveContainer" containerID="826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b" Apr 24 22:01:29.827172 
ip-10-0-131-237 kubenswrapper[2580]: E0424 22:01:29.827152 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b\": container with ID starting with 826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b not found: ID does not exist" containerID="826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b" Apr 24 22:01:29.827212 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.827183 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b"} err="failed to get container status \"826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b\": rpc error: code = NotFound desc = could not find container \"826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b\": container with ID starting with 826dfaafc4618a3b6642bce14385efdf92f07cc8b5a193ae241624251694127b not found: ID does not exist" Apr 24 22:01:29.827212 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.827204 2580 scope.go:117] "RemoveContainer" containerID="cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265" Apr 24 22:01:29.827478 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:01:29.827455 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265\": container with ID starting with cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265 not found: ID does not exist" containerID="cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265" Apr 24 22:01:29.827547 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.827487 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265"} err="failed to get container status \"cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265\": rpc error: code = NotFound desc = could not find container \"cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265\": container with ID starting with cbe09185835cf346dea1732a64fec90382c99a3376dd438008f4c98688497265 not found: ID does not exist" Apr 24 22:01:29.831139 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.831107 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p"] Apr 24 22:01:29.838800 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:29.838771 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-kqf9p"] Apr 24 22:01:30.577588 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:01:30.577555 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" path="/var/lib/kubelet/pods/79d743c6-d0ed-4314-971f-3549b9056af0/volumes" Apr 24 22:03:23.184000 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:23.183963 2580 generic.go:358] "Generic (PLEG): container finished" podID="3142eb2a-18c7-4142-a559-62b52c768f20" containerID="335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e" exitCode=0 Apr 24 22:03:23.184454 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:23.184036 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" event={"ID":"3142eb2a-18c7-4142-a559-62b52c768f20","Type":"ContainerDied","Data":"335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e"} Apr 24 22:03:23.185191 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:23.185176 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:03:24.189767 ip-10-0-131-237 
kubenswrapper[2580]: I0424 22:03:24.189724 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" event={"ID":"3142eb2a-18c7-4142-a559-62b52c768f20","Type":"ContainerStarted","Data":"f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6"} Apr 24 22:03:24.212821 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:24.212763 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" podStartSLOduration=131.212742902 podStartE2EDuration="2m11.212742902s" podCreationTimestamp="2026-04-24 22:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:03:24.209900486 +0000 UTC m=+2218.194115265" watchObservedRunningTime="2026-04-24 22:03:24.212742902 +0000 UTC m=+2218.196957684" Apr 24 22:03:33.911686 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:33.911631 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:03:33.912194 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:33.911727 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:03:33.913320 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:33.913289 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:03:43.912188 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:43.912084 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" 
podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:03:53.912218 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:03:53.912176 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:04:03.911621 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:03.911558 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:04:13.911548 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:13.911492 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:04:23.911554 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:23.911507 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:04:33.911794 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:33.911745 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" 
podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:04:43.911673 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:43.911606 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:04:45.072977 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:45.072934 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf"] Apr 24 22:04:45.073379 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:45.073311 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" containerID="cri-o://395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150" gracePeriod=30 Apr 24 22:04:45.088701 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:45.088645 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4"] Apr 24 22:04:45.089026 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:45.088998 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" containerID="cri-o://66defc4ba9cc14af6217e902ce6aa26cc12df163f86968f7f3f358a0fe7f7165" gracePeriod=30 Apr 24 22:04:45.089115 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:45.089024 2580 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="storage-initializer" containerID="cri-o://0f3c575d34b109930a0b2b36f72e058f25ef1381ec2b9b3266ec15c4dd082a7a" gracePeriod=30 Apr 24 22:04:45.468844 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:45.468806 2580 generic.go:358] "Generic (PLEG): container finished" podID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerID="66defc4ba9cc14af6217e902ce6aa26cc12df163f86968f7f3f358a0fe7f7165" exitCode=0 Apr 24 22:04:45.469025 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:45.468861 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" event={"ID":"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0","Type":"ContainerDied","Data":"66defc4ba9cc14af6217e902ce6aa26cc12df163f86968f7f3f358a0fe7f7165"} Apr 24 22:04:50.630067 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:50.630021 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 24 22:04:53.353041 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:53.352991 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 24 22:04:53.911281 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:04:53.911241 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" 
podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 22:05:00.558274 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.558239 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"] Apr 24 22:05:00.558719 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.558576 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" Apr 24 22:05:00.558719 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.558587 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" Apr 24 22:05:00.558719 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.558604 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="storage-initializer" Apr 24 22:05:00.558719 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.558610 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="storage-initializer" Apr 24 22:05:00.558719 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.558688 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="79d743c6-d0ed-4314-971f-3549b9056af0" containerName="main" Apr 24 22:05:00.560983 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.560965 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.565063 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.565037 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 24 22:05:00.565222 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.565087 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-zlbfk\""
Apr 24 22:05:00.569531 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.569499 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"]
Apr 24 22:05:00.572624 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.572598 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.583672 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.583592 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"]
Apr 24 22:05:00.587322 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.587289 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"]
Apr 24 22:05:00.630413 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.630374 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 22:05:00.658628 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658592 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.658819 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658639 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-dshm\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.658819 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658679 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/374952dd-33de-4046-878a-c2574579f174-tls-certs\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.658819 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4c5j\" (UniqueName: \"kubernetes.io/projected/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kube-api-access-n4c5j\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.658819 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658784 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-model-cache\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.658971 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658830 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgb7b\" (UniqueName: \"kubernetes.io/projected/374952dd-33de-4046-878a-c2574579f174-kube-api-access-dgb7b\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.658971 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658862 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.658971 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658892 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.658971 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658912 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.658971 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658927 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-home\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.658971 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658942 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.658971 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658967 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.659183 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.658993 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-tmp-dir\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.659183 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.659024 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-home\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.760225 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760180 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-model-cache\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.760420 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760228 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgb7b\" (UniqueName: \"kubernetes.io/projected/374952dd-33de-4046-878a-c2574579f174-kube-api-access-dgb7b\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.760420 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760266 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.760420 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.760420 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760320 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.760420 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760344 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-home\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.760420 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760370 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.760420 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760394 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.760420 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-tmp-dir\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.760879 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760458 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-home\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.760879 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760496 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.760879 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760539 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-dshm\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.760879 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760561 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/374952dd-33de-4046-878a-c2574579f174-tls-certs\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.760879 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760589 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4c5j\" (UniqueName: \"kubernetes.io/projected/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kube-api-access-n4c5j\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.760879 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-model-cache\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.761196 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760895 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-home\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.761196 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.760945 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.761196 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.761042 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.761400 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.761316 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tmp-dir\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.761400 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.761320 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-home\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.761400 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.761372 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-tmp-dir\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.761566 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.761445 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.763035 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.763011 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.763228 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.763210 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.763316 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.763298 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-dshm\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.763689 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.763650 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/374952dd-33de-4046-878a-c2574579f174-tls-certs\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.770736 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.770708 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4c5j\" (UniqueName: \"kubernetes.io/projected/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kube-api-access-n4c5j\") pod \"router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:00.771462 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.771434 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgb7b\" (UniqueName: \"kubernetes.io/projected/374952dd-33de-4046-878a-c2574579f174-kube-api-access-dgb7b\") pod \"router-with-refs-pd-test-kserve-c8d6887cf-khlr5\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.872319 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.872214 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:00.884594 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:00.884567 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:01.026731 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:01.026703 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"]
Apr 24 22:05:01.028885 ip-10-0-131-237 kubenswrapper[2580]: W0424 22:05:01.028853 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod374952dd_33de_4046_878a_c2574579f174.slice/crio-9bddb5b1e6d12a9b51482a4e7fcde1a6601be0bbc914f8643bd5882dd77944f1 WatchSource:0}: Error finding container 9bddb5b1e6d12a9b51482a4e7fcde1a6601be0bbc914f8643bd5882dd77944f1: Status 404 returned error can't find the container with id 9bddb5b1e6d12a9b51482a4e7fcde1a6601be0bbc914f8643bd5882dd77944f1
Apr 24 22:05:01.047629 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:01.047553 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"]
Apr 24 22:05:01.049750 ip-10-0-131-237 kubenswrapper[2580]: W0424 22:05:01.049720 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6aad58e0_06d6_49d3_aa91_93ae900e6bb2.slice/crio-b6767fe2c19d579b15f1aed31956332814fd43c62c50ae3f269ef92c1fba82d9 WatchSource:0}: Error finding container b6767fe2c19d579b15f1aed31956332814fd43c62c50ae3f269ef92c1fba82d9: Status 404 returned error can't find the container with id b6767fe2c19d579b15f1aed31956332814fd43c62c50ae3f269ef92c1fba82d9
Apr 24 22:05:01.526152 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:01.526111 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" event={"ID":"374952dd-33de-4046-878a-c2574579f174","Type":"ContainerStarted","Data":"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b"}
Apr 24 22:05:01.526152 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:01.526156 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" event={"ID":"374952dd-33de-4046-878a-c2574579f174","Type":"ContainerStarted","Data":"9bddb5b1e6d12a9b51482a4e7fcde1a6601be0bbc914f8643bd5882dd77944f1"}
Apr 24 22:05:01.526414 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:01.526206 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:01.527576 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:01.527544 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" event={"ID":"6aad58e0-06d6-49d3-aa91-93ae900e6bb2","Type":"ContainerStarted","Data":"21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2"}
Apr 24 22:05:01.527718 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:01.527583 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" event={"ID":"6aad58e0-06d6-49d3-aa91-93ae900e6bb2","Type":"ContainerStarted","Data":"b6767fe2c19d579b15f1aed31956332814fd43c62c50ae3f269ef92c1fba82d9"}
Apr 24 22:05:02.534317 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:02.534281 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" event={"ID":"374952dd-33de-4046-878a-c2574579f174","Type":"ContainerStarted","Data":"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a"}
Apr 24 22:05:03.353065 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:03.353011 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 22:05:03.924978 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:03.924945 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv"
Apr 24 22:05:03.934353 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:03.934318 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv"
Apr 24 22:05:06.553471 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:06.553435 2580 generic.go:358] "Generic (PLEG): container finished" podID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerID="21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2" exitCode=0
Apr 24 22:05:06.553949 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:06.553512 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" event={"ID":"6aad58e0-06d6-49d3-aa91-93ae900e6bb2","Type":"ContainerDied","Data":"21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2"}
Apr 24 22:05:06.555338 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:06.555313 2580 generic.go:358] "Generic (PLEG): container finished" podID="374952dd-33de-4046-878a-c2574579f174" containerID="8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a" exitCode=0
Apr 24 22:05:06.555456 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:06.555391 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" event={"ID":"374952dd-33de-4046-878a-c2574579f174","Type":"ContainerDied","Data":"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a"}
Apr 24 22:05:07.561044 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:07.561005 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" event={"ID":"6aad58e0-06d6-49d3-aa91-93ae900e6bb2","Type":"ContainerStarted","Data":"36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399"}
Apr 24 22:05:07.563078 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:07.563048 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" event={"ID":"374952dd-33de-4046-878a-c2574579f174","Type":"ContainerStarted","Data":"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d"}
Apr 24 22:05:07.584670 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:07.584595 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podStartSLOduration=7.584574189 podStartE2EDuration="7.584574189s" podCreationTimestamp="2026-04-24 22:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:05:07.582527617 +0000 UTC m=+2321.566742396" watchObservedRunningTime="2026-04-24 22:05:07.584574189 +0000 UTC m=+2321.568788968"
Apr 24 22:05:07.606143 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:07.606057 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podStartSLOduration=7.606038593 podStartE2EDuration="7.606038593s" podCreationTimestamp="2026-04-24 22:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:05:07.603233618 +0000 UTC m=+2321.587448397" watchObservedRunningTime="2026-04-24 22:05:07.606038593 +0000 UTC m=+2321.590253371"
Apr 24 22:05:10.630179 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:10.630117 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 22:05:10.630648 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:10.630209 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4"
Apr 24 22:05:10.872867 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:10.872828 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:10.873065 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:10.872882 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:05:10.874493 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:10.874460 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 24 22:05:10.885263 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:10.885179 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:10.885263 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:10.885224 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:05:10.886648 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:10.886615 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused"
Apr 24 22:05:13.353356 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:13.353301 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 22:05:15.379686 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.379643 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf"
Apr 24 22:05:15.507054 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507010 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tmp-dir\") pod \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") "
Apr 24 22:05:15.507054 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507060 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqlbc\" (UniqueName: \"kubernetes.io/projected/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kube-api-access-pqlbc\") pod \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") "
Apr 24 22:05:15.507321 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507123 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-home\") pod \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") "
Apr 24 22:05:15.507321 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507158 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kserve-provision-location\") pod \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") "
Apr 24 22:05:15.507321 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507183 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-dshm\") pod \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") "
Apr 24 22:05:15.507321 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507231 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-model-cache\") pod \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") "
Apr 24 22:05:15.507321 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507286 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tls-certs\") pod \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\" (UID: \"909fa8c1-24be-412e-bfb4-b9daa8b30bad\") "
Apr 24 22:05:15.508006 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507760 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-home" (OuterVolumeSpecName: "home") pod "909fa8c1-24be-412e-bfb4-b9daa8b30bad" (UID: "909fa8c1-24be-412e-bfb4-b9daa8b30bad"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:05:15.508006 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.507942 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-model-cache" (OuterVolumeSpecName: "model-cache") pod "909fa8c1-24be-412e-bfb4-b9daa8b30bad" (UID: "909fa8c1-24be-412e-bfb4-b9daa8b30bad"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:05:15.510171 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.510117 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "909fa8c1-24be-412e-bfb4-b9daa8b30bad" (UID: "909fa8c1-24be-412e-bfb4-b9daa8b30bad"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:05:15.510171 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.510119 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-dshm" (OuterVolumeSpecName: "dshm") pod "909fa8c1-24be-412e-bfb4-b9daa8b30bad" (UID: "909fa8c1-24be-412e-bfb4-b9daa8b30bad"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:05:15.510171 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.510146 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kube-api-access-pqlbc" (OuterVolumeSpecName: "kube-api-access-pqlbc") pod "909fa8c1-24be-412e-bfb4-b9daa8b30bad" (UID: "909fa8c1-24be-412e-bfb4-b9daa8b30bad"). InnerVolumeSpecName "kube-api-access-pqlbc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:05:15.517901 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.517870 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "909fa8c1-24be-412e-bfb4-b9daa8b30bad" (UID: "909fa8c1-24be-412e-bfb4-b9daa8b30bad"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:05:15.572602 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.572548 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "909fa8c1-24be-412e-bfb4-b9daa8b30bad" (UID: "909fa8c1-24be-412e-bfb4-b9daa8b30bad"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:15.592607 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.592577 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4_d7f0592c-64c4-4c25-a8b1-81653e9bf9a0/storage-initializer/0.log" Apr 24 22:05:15.593001 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.592978 2580 generic.go:358] "Generic (PLEG): container finished" podID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerID="0f3c575d34b109930a0b2b36f72e058f25ef1381ec2b9b3266ec15c4dd082a7a" exitCode=137 Apr 24 22:05:15.593128 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.593053 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" event={"ID":"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0","Type":"ContainerDied","Data":"0f3c575d34b109930a0b2b36f72e058f25ef1381ec2b9b3266ec15c4dd082a7a"} Apr 24 22:05:15.594695 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.594643 2580 generic.go:358] "Generic (PLEG): container finished" podID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerID="395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150" exitCode=137 Apr 24 22:05:15.594826 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.594697 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" event={"ID":"909fa8c1-24be-412e-bfb4-b9daa8b30bad","Type":"ContainerDied","Data":"395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150"} Apr 24 22:05:15.594826 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.594736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" event={"ID":"909fa8c1-24be-412e-bfb4-b9daa8b30bad","Type":"ContainerDied","Data":"f99eca8c9c918035ea93be64d54efc444320302320f3ff3d35f2bee66ecf34be"} 
Apr 24 22:05:15.594826 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.594757 2580 scope.go:117] "RemoveContainer" containerID="395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150" Apr 24 22:05:15.594826 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.594757 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf" Apr 24 22:05:15.605521 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.605503 2580 scope.go:117] "RemoveContainer" containerID="ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70" Apr 24 22:05:15.608276 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.608252 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:15.608369 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.608280 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:15.608369 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.608296 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pqlbc\" (UniqueName: \"kubernetes.io/projected/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kube-api-access-pqlbc\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:15.608369 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.608310 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:15.608369 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.608325 2580 reconciler_common.go:299] "Volume detached for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:15.608369 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.608337 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:15.608369 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.608351 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/909fa8c1-24be-412e-bfb4-b9daa8b30bad-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:15.618558 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.618535 2580 scope.go:117] "RemoveContainer" containerID="395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150" Apr 24 22:05:15.618795 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.618698 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf"] Apr 24 22:05:15.618954 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:05:15.618934 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150\": container with ID starting with 395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150 not found: ID does not exist" containerID="395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150" Apr 24 22:05:15.619045 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.618964 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150"} err="failed to get container status 
\"395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150\": rpc error: code = NotFound desc = could not find container \"395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150\": container with ID starting with 395217055dca73967ef81e1f934ec8040963e4364934eb709c5e7b276e6ef150 not found: ID does not exist" Apr 24 22:05:15.619045 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.618993 2580 scope.go:117] "RemoveContainer" containerID="ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70" Apr 24 22:05:15.623329 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:05:15.623086 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70\": container with ID starting with ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70 not found: ID does not exist" containerID="ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70" Apr 24 22:05:15.623329 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.623172 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70"} err="failed to get container status \"ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70\": rpc error: code = NotFound desc = could not find container \"ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70\": container with ID starting with ac51728d31def846d2460455bfeddaba54c7a8f2e126b98477be8468ea8f2d70 not found: ID does not exist" Apr 24 22:05:15.635449 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.635416 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-5ffd6cfdbb-vqghf"] Apr 24 22:05:15.772034 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.772009 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4_d7f0592c-64c4-4c25-a8b1-81653e9bf9a0/storage-initializer/0.log" Apr 24 22:05:15.772460 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.772435 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 22:05:15.911310 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.911265 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5tgf\" (UniqueName: \"kubernetes.io/projected/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kube-api-access-c5tgf\") pod \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " Apr 24 22:05:15.911514 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.911322 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tls-certs\") pod \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " Apr 24 22:05:15.911514 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.911354 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kserve-provision-location\") pod \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " Apr 24 22:05:15.911514 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.911383 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-home\") pod \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " Apr 24 22:05:15.911514 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.911441 2580 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-model-cache\") pod \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " Apr 24 22:05:15.911514 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.911470 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tmp-dir\") pod \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " Apr 24 22:05:15.911514 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.911497 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-dshm\") pod \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\" (UID: \"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0\") " Apr 24 22:05:15.911856 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.911756 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-home" (OuterVolumeSpecName: "home") pod "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" (UID: "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:15.912121 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.912052 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-model-cache" (OuterVolumeSpecName: "model-cache") pod "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" (UID: "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:15.912121 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.912080 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" (UID: "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:15.914336 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.914310 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kube-api-access-c5tgf" (OuterVolumeSpecName: "kube-api-access-c5tgf") pod "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" (UID: "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0"). InnerVolumeSpecName "kube-api-access-c5tgf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:05:15.914336 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.914319 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" (UID: "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:05:15.914513 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.914302 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-dshm" (OuterVolumeSpecName: "dshm") pod "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" (UID: "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:15.969036 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:15.968975 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" (UID: "d7f0592c-64c4-4c25-a8b1-81653e9bf9a0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:16.012622 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.012574 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c5tgf\" (UniqueName: \"kubernetes.io/projected/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kube-api-access-c5tgf\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:16.012622 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.012604 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:16.012622 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.012615 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:16.012622 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.012625 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:16.012622 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.012634 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:16.013052 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.012643 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:16.013052 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.012651 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:16.579728 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.579694 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" path="/var/lib/kubelet/pods/909fa8c1-24be-412e-bfb4-b9daa8b30bad/volumes" Apr 24 22:05:16.600583 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.600552 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4_d7f0592c-64c4-4c25-a8b1-81653e9bf9a0/storage-initializer/0.log" Apr 24 22:05:16.601044 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.601019 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" event={"ID":"d7f0592c-64c4-4c25-a8b1-81653e9bf9a0","Type":"ContainerDied","Data":"68188c01f5d86a5d5693c1d8bdbb005379a4b0e68c3e8cb2f9d2e433136157a6"} Apr 24 22:05:16.601044 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.601038 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4" Apr 24 22:05:16.601208 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.601067 2580 scope.go:117] "RemoveContainer" containerID="0f3c575d34b109930a0b2b36f72e058f25ef1381ec2b9b3266ec15c4dd082a7a" Apr 24 22:05:16.640079 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.640038 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4"] Apr 24 22:05:16.644085 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.644050 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5dc654c5d7-bknh4"] Apr 24 22:05:16.672281 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:16.672252 2580 scope.go:117] "RemoveContainer" containerID="66defc4ba9cc14af6217e902ce6aa26cc12df163f86968f7f3f358a0fe7f7165" Apr 24 22:05:18.580883 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:18.580841 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" path="/var/lib/kubelet/pods/d7f0592c-64c4-4c25-a8b1-81653e9bf9a0/volumes" Apr 24 22:05:20.705454 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:20.705421 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv"] Apr 24 22:05:20.706415 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:20.706383 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" containerID="cri-o://f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6" gracePeriod=30 Apr 24 22:05:20.873080 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:20.873037 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:05:20.885151 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:20.885103 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused" Apr 24 22:05:20.890307 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:20.890277 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" Apr 24 22:05:30.873345 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:30.873287 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:05:30.885613 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:30.885563 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused" Apr 24 22:05:40.873158 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:40.873103 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" 
containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:05:40.885455 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:40.885414 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused" Apr 24 22:05:50.872992 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:50.872924 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:05:50.885907 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:50.885458 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused" Apr 24 22:05:50.987400 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:50.987374 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6f89b754f-6qfzv_3142eb2a-18c7-4142-a559-62b52c768f20/main/0.log" Apr 24 22:05:50.987843 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:50.987822 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:05:51.041949 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.041900 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-dshm\") pod \"3142eb2a-18c7-4142-a559-62b52c768f20\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " Apr 24 22:05:51.041949 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.041950 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-home\") pod \"3142eb2a-18c7-4142-a559-62b52c768f20\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " Apr 24 22:05:51.042200 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.041999 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3142eb2a-18c7-4142-a559-62b52c768f20-tls-certs\") pod \"3142eb2a-18c7-4142-a559-62b52c768f20\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " Apr 24 22:05:51.042200 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.042053 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvjsn\" (UniqueName: \"kubernetes.io/projected/3142eb2a-18c7-4142-a559-62b52c768f20-kube-api-access-kvjsn\") pod \"3142eb2a-18c7-4142-a559-62b52c768f20\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " Apr 24 22:05:51.042200 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.042108 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-tmp-dir\") pod \"3142eb2a-18c7-4142-a559-62b52c768f20\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " Apr 24 22:05:51.042200 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.042144 2580 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-model-cache\") pod \"3142eb2a-18c7-4142-a559-62b52c768f20\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " Apr 24 22:05:51.042200 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.042169 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-kserve-provision-location\") pod \"3142eb2a-18c7-4142-a559-62b52c768f20\" (UID: \"3142eb2a-18c7-4142-a559-62b52c768f20\") " Apr 24 22:05:51.042936 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.042730 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-model-cache" (OuterVolumeSpecName: "model-cache") pod "3142eb2a-18c7-4142-a559-62b52c768f20" (UID: "3142eb2a-18c7-4142-a559-62b52c768f20"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:51.042936 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.042837 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-home" (OuterVolumeSpecName: "home") pod "3142eb2a-18c7-4142-a559-62b52c768f20" (UID: "3142eb2a-18c7-4142-a559-62b52c768f20"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:51.044529 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.044493 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3142eb2a-18c7-4142-a559-62b52c768f20-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3142eb2a-18c7-4142-a559-62b52c768f20" (UID: "3142eb2a-18c7-4142-a559-62b52c768f20"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:05:51.044922 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.044895 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3142eb2a-18c7-4142-a559-62b52c768f20-kube-api-access-kvjsn" (OuterVolumeSpecName: "kube-api-access-kvjsn") pod "3142eb2a-18c7-4142-a559-62b52c768f20" (UID: "3142eb2a-18c7-4142-a559-62b52c768f20"). InnerVolumeSpecName "kube-api-access-kvjsn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:05:51.046869 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.046841 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-dshm" (OuterVolumeSpecName: "dshm") pod "3142eb2a-18c7-4142-a559-62b52c768f20" (UID: "3142eb2a-18c7-4142-a559-62b52c768f20"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:51.056040 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.056001 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "3142eb2a-18c7-4142-a559-62b52c768f20" (UID: "3142eb2a-18c7-4142-a559-62b52c768f20"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:51.117438 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.117375 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3142eb2a-18c7-4142-a559-62b52c768f20" (UID: "3142eb2a-18c7-4142-a559-62b52c768f20"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:05:51.143936 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.143888 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:51.143936 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.143925 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:51.143936 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.143939 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3142eb2a-18c7-4142-a559-62b52c768f20-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:51.144247 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.143954 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kvjsn\" (UniqueName: \"kubernetes.io/projected/3142eb2a-18c7-4142-a559-62b52c768f20-kube-api-access-kvjsn\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:51.144247 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.143965 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:51.144247 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.143979 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:51.144247 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.143990 2580 reconciler_common.go:299] "Volume detached 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3142eb2a-18c7-4142-a559-62b52c768f20-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:05:51.733453 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.733421 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6f89b754f-6qfzv_3142eb2a-18c7-4142-a559-62b52c768f20/main/0.log" Apr 24 22:05:51.733812 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.733781 2580 generic.go:358] "Generic (PLEG): container finished" podID="3142eb2a-18c7-4142-a559-62b52c768f20" containerID="f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6" exitCode=137 Apr 24 22:05:51.733935 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.733869 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" event={"ID":"3142eb2a-18c7-4142-a559-62b52c768f20","Type":"ContainerDied","Data":"f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6"} Apr 24 22:05:51.733935 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.733909 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" event={"ID":"3142eb2a-18c7-4142-a559-62b52c768f20","Type":"ContainerDied","Data":"7d79d096fa166e4aed0c69163710847034b034b3c907f9ea07873c4b259e28ab"} Apr 24 22:05:51.733935 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.733926 2580 scope.go:117] "RemoveContainer" containerID="f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6" Apr 24 22:05:51.734053 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.733875 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv" Apr 24 22:05:51.742404 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.742383 2580 scope.go:117] "RemoveContainer" containerID="335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e" Apr 24 22:05:51.759871 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.759760 2580 scope.go:117] "RemoveContainer" containerID="f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6" Apr 24 22:05:51.760161 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:05:51.760130 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6\": container with ID starting with f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6 not found: ID does not exist" containerID="f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6" Apr 24 22:05:51.760318 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.760172 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6"} err="failed to get container status \"f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6\": rpc error: code = NotFound desc = could not find container \"f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6\": container with ID starting with f90af7f8aae5118b6bed9f209e73614e125f7220c3efe9228ea25375e61a63c6 not found: ID does not exist" Apr 24 22:05:51.760318 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.760207 2580 scope.go:117] "RemoveContainer" containerID="335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e" Apr 24 22:05:51.760318 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.760293 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv"] Apr 24 
22:05:51.760893 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:05:51.760864 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e\": container with ID starting with 335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e not found: ID does not exist" containerID="335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e" Apr 24 22:05:51.761008 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.760902 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e"} err="failed to get container status \"335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e\": rpc error: code = NotFound desc = could not find container \"335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e\": container with ID starting with 335836e8515392a454b38915e998cfd77f735cfcdb77860bb5c181f62ca03b2e not found: ID does not exist" Apr 24 22:05:51.764050 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:51.764026 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6f89b754f-6qfzv"] Apr 24 22:05:52.579236 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:05:52.579197 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" path="/var/lib/kubelet/pods/3142eb2a-18c7-4142-a559-62b52c768f20/volumes" Apr 24 22:06:00.873462 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:00.873396 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:06:00.885379 
ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:00.885336 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused" Apr 24 22:06:10.873276 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:10.873221 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:06:10.885648 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:10.885605 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused" Apr 24 22:06:20.873163 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:20.873098 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:06:20.885645 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:20.885605 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: 
connect: connection refused" Apr 24 22:06:26.631856 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:26.631823 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 22:06:26.639055 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:26.639029 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 22:06:30.873237 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:30.873185 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:06:30.885593 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:30.885545 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused" Apr 24 22:06:34.338104 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338053 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338391 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338403 2580 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338411 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="storage-initializer" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338417 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="storage-initializer" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338427 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338433 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338440 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="storage-initializer" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338445 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="storage-initializer" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338451 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338456 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338463 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="storage-initializer" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338468 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="storage-initializer" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338515 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="storage-initializer" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338524 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="909fa8c1-24be-412e-bfb4-b9daa8b30bad" containerName="main" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338530 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7f0592c-64c4-4c25-a8b1-81653e9bf9a0" containerName="llm-d-routing-sidecar" Apr 24 22:06:34.338750 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.338536 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3142eb2a-18c7-4142-a559-62b52c768f20" containerName="main" Apr 24 22:06:34.342861 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.342834 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.346059 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.346027 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-gnhtx\"" Apr 24 22:06:34.346191 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.346079 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 24 22:06:34.364426 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.364385 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:06:34.418443 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.418397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.418679 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.418451 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.418679 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.418513 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.418679 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.418530 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.418679 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.418554 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.418679 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.418602 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgkp5\" (UniqueName: \"kubernetes.io/projected/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kube-api-access-vgkp5\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.418679 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.418630 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.519950 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.519900 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.520159 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.519986 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.520159 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.520051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.520159 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.520102 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.520159 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.520132 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.520381 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.520158 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.520381 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.520240 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgkp5\" (UniqueName: \"kubernetes.io/projected/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kube-api-access-vgkp5\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.521115 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.521081 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.521318 ip-10-0-131-237 kubenswrapper[2580]: I0424 
22:06:34.521296 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.521510 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.521488 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.526186 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.526156 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.528319 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.528271 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.528797 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.528514 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.530006 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.529985 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgkp5\" (UniqueName: \"kubernetes.io/projected/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kube-api-access-vgkp5\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.653690 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.653640 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:06:34.794323 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.794236 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:06:34.798190 ip-10-0-131-237 kubenswrapper[2580]: W0424 22:06:34.798156 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc154fa61_5139_4e2d_b2d9_345eee5fa7e4.slice/crio-ae4a45e8f7e2397922f317a05ea6e74c3e218ae99bd4c9ad7d79a55c64261c41 WatchSource:0}: Error finding container ae4a45e8f7e2397922f317a05ea6e74c3e218ae99bd4c9ad7d79a55c64261c41: Status 404 returned error can't find the container with id ae4a45e8f7e2397922f317a05ea6e74c3e218ae99bd4c9ad7d79a55c64261c41 Apr 24 22:06:34.884805 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.884750 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"c154fa61-5139-4e2d-b2d9-345eee5fa7e4","Type":"ContainerStarted","Data":"e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4"} Apr 24 22:06:34.884805 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:34.884796 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c154fa61-5139-4e2d-b2d9-345eee5fa7e4","Type":"ContainerStarted","Data":"ae4a45e8f7e2397922f317a05ea6e74c3e218ae99bd4c9ad7d79a55c64261c41"} Apr 24 22:06:40.873053 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:40.873003 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:06:40.885029 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:40.884978 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused" Apr 24 22:06:50.873031 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:50.872976 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused" Apr 24 22:06:50.885138 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:06:50.885090 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" 
podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused"
Apr 24 22:07:00.873497 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:00.873437 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 24 22:07:00.885943 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:00.885904 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused"
Apr 24 22:07:10.873602 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:10.873545 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 24 22:07:10.885938 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:10.885888 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused"
Apr 24 22:07:20.872976 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:20.872925 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8001/health\": dial tcp 10.132.0.42:8001: connect: connection refused"
Apr 24 22:07:20.885907 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:20.885863 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" probeResult="failure" output="Get \"https://10.132.0.43:8000/health\": dial tcp 10.132.0.43:8000: connect: connection refused"
Apr 24 22:07:30.888813 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:30.888778 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:07:30.895990 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:30.895966 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:07:30.904114 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:30.904084 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:07:30.904835 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:30.904816 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:07:42.846552 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:42.846514 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"]
Apr 24 22:07:42.847078 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:42.846900 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main" containerID="cri-o://d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d" gracePeriod=30
Apr 24 22:07:42.852733 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:42.852704 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"]
Apr 24 22:07:42.853001 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:42.852978 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main" containerID="cri-o://36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399" gracePeriod=30
Apr 24 22:07:47.956811 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:47.956776 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"]
Apr 24 22:07:47.961601 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:47.961580 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:47.963751 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:47.963726 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\""
Apr 24 22:07:47.971785 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:47.971757 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"]
Apr 24 22:07:48.094288 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.094247 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.094474 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.094300 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.094474 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.094348 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z42bt\" (UniqueName: \"kubernetes.io/projected/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kube-api-access-z42bt\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.094474 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.094397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.094474 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.094435 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.094616 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.094490 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.094616 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.094517 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.194984 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.194946 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195181 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.194991 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195181 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195017 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z42bt\" (UniqueName: \"kubernetes.io/projected/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kube-api-access-z42bt\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195181 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195181 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195088 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195376 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195236 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195376 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195287 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195495 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195474 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195554 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195530 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195610 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.195647 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.195627 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.198034 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.198012 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.198226 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.198207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.202878 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.202858 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z42bt\" (UniqueName: \"kubernetes.io/projected/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kube-api-access-z42bt\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.271926 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.271816 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"
Apr 24 22:07:48.416187 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:48.416122 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"]
Apr 24 22:07:48.422318 ip-10-0-131-237 kubenswrapper[2580]: W0424 22:07:48.422280 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8ac6250_7e00_4302_94b4_8c94b7e3c434.slice/crio-b38064159c5dcfc1af266bfed649da238b55f1044767443dfd9df32b6ecb8580 WatchSource:0}: Error finding container b38064159c5dcfc1af266bfed649da238b55f1044767443dfd9df32b6ecb8580: Status 404 returned error can't find the container with id b38064159c5dcfc1af266bfed649da238b55f1044767443dfd9df32b6ecb8580
Apr 24 22:07:49.149856 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:49.149815 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" event={"ID":"e8ac6250-7e00-4302-94b4-8c94b7e3c434","Type":"ContainerStarted","Data":"d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9"}
Apr 24 22:07:49.149856 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:07:49.149863 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" event={"ID":"e8ac6250-7e00-4302-94b4-8c94b7e3c434","Type":"ContainerStarted","Data":"b38064159c5dcfc1af266bfed649da238b55f1044767443dfd9df32b6ecb8580"}
Apr 24 22:08:00.192673 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:00.192634 2580 generic.go:358] "Generic (PLEG): container finished" podID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerID="e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4" exitCode=0
Apr 24 22:08:00.193304 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:00.192712 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c154fa61-5139-4e2d-b2d9-345eee5fa7e4","Type":"ContainerDied","Data":"e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4"}
Apr 24 22:08:01.197867 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:01.197827 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c154fa61-5139-4e2d-b2d9-345eee5fa7e4","Type":"ContainerStarted","Data":"d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa"}
Apr 24 22:08:01.218669 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:01.218600 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=87.218584572 podStartE2EDuration="1m27.218584572s" podCreationTimestamp="2026-04-24 22:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:08:01.216679537 +0000 UTC m=+2495.200894317" watchObservedRunningTime="2026-04-24 22:08:01.218584572 +0000 UTC m=+2495.202799348"
Apr 24 22:08:04.654250 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:04.654207 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 24 22:08:04.654698 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:04.654360 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 24 22:08:04.656001 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:04.655972 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused"
Apr 24 22:08:12.847469 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:12.847365 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="llm-d-routing-sidecar" containerID="cri-o://c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b" gracePeriod=2
Apr 24 22:08:13.143820 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.143791 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:08:13.146823 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.146802 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-c8d6887cf-khlr5_374952dd-33de-4046-878a-c2574579f174/main/0.log"
Apr 24 22:08:13.147447 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.147430 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:08:13.209222 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209171 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-dshm\") pod \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") "
Apr 24 22:08:13.209416 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209265 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-kserve-provision-location\") pod \"374952dd-33de-4046-878a-c2574579f174\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") "
Apr 24 22:08:13.209416 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209292 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-dshm\") pod \"374952dd-33de-4046-878a-c2574579f174\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") "
Apr 24 22:08:13.209416 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209335 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-home\") pod \"374952dd-33de-4046-878a-c2574579f174\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") "
Apr 24 22:08:13.209634 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209409 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/374952dd-33de-4046-878a-c2574579f174-tls-certs\") pod \"374952dd-33de-4046-878a-c2574579f174\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") "
Apr 24 22:08:13.209634 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209502 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tmp-dir\") pod \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") "
Apr 24 22:08:13.209634 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209545 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgb7b\" (UniqueName: \"kubernetes.io/projected/374952dd-33de-4046-878a-c2574579f174-kube-api-access-dgb7b\") pod \"374952dd-33de-4046-878a-c2574579f174\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") "
Apr 24 22:08:13.209634 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209585 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kserve-provision-location\") pod \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") "
Apr 24 22:08:13.209634 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209608 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-tmp-dir\") pod \"374952dd-33de-4046-878a-c2574579f174\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") "
Apr 24 22:08:13.209924 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209695 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4c5j\" (UniqueName: \"kubernetes.io/projected/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kube-api-access-n4c5j\") pod \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") "
Apr 24 22:08:13.209924 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209729 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-model-cache\") pod \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") "
Apr 24 22:08:13.209924 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209755 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-home\") pod \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") "
Apr 24 22:08:13.209924 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209794 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-model-cache\") pod \"374952dd-33de-4046-878a-c2574579f174\" (UID: \"374952dd-33de-4046-878a-c2574579f174\") "
Apr 24 22:08:13.209924 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.209825 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tls-certs\") pod \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") "
Apr 24 22:08:13.210193 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.210119 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-home" (OuterVolumeSpecName: "home") pod "374952dd-33de-4046-878a-c2574579f174" (UID: "374952dd-33de-4046-878a-c2574579f174"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.213132 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.212607 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-dshm" (OuterVolumeSpecName: "dshm") pod "6aad58e0-06d6-49d3-aa91-93ae900e6bb2" (UID: "6aad58e0-06d6-49d3-aa91-93ae900e6bb2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.213132 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.212862 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-model-cache" (OuterVolumeSpecName: "model-cache") pod "6aad58e0-06d6-49d3-aa91-93ae900e6bb2" (UID: "6aad58e0-06d6-49d3-aa91-93ae900e6bb2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.213132 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.213020 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/374952dd-33de-4046-878a-c2574579f174-kube-api-access-dgb7b" (OuterVolumeSpecName: "kube-api-access-dgb7b") pod "374952dd-33de-4046-878a-c2574579f174" (UID: "374952dd-33de-4046-878a-c2574579f174"). InnerVolumeSpecName "kube-api-access-dgb7b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:08:13.213377 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.213200 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/374952dd-33de-4046-878a-c2574579f174-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "374952dd-33de-4046-878a-c2574579f174" (UID: "374952dd-33de-4046-878a-c2574579f174"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:08:13.214709 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.214681 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-home" (OuterVolumeSpecName: "home") pod "6aad58e0-06d6-49d3-aa91-93ae900e6bb2" (UID: "6aad58e0-06d6-49d3-aa91-93ae900e6bb2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.215061 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.215033 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-model-cache" (OuterVolumeSpecName: "model-cache") pod "374952dd-33de-4046-878a-c2574579f174" (UID: "374952dd-33de-4046-878a-c2574579f174"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.215830 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.215782 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kube-api-access-n4c5j" (OuterVolumeSpecName: "kube-api-access-n4c5j") pod "6aad58e0-06d6-49d3-aa91-93ae900e6bb2" (UID: "6aad58e0-06d6-49d3-aa91-93ae900e6bb2"). InnerVolumeSpecName "kube-api-access-n4c5j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:08:13.215962 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.215938 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6aad58e0-06d6-49d3-aa91-93ae900e6bb2" (UID: "6aad58e0-06d6-49d3-aa91-93ae900e6bb2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:08:13.216394 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.216366 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-dshm" (OuterVolumeSpecName: "dshm") pod "374952dd-33de-4046-878a-c2574579f174" (UID: "374952dd-33de-4046-878a-c2574579f174"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.225798 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.225751 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "374952dd-33de-4046-878a-c2574579f174" (UID: "374952dd-33de-4046-878a-c2574579f174"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.230224 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.230177 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6aad58e0-06d6-49d3-aa91-93ae900e6bb2" (UID: "6aad58e0-06d6-49d3-aa91-93ae900e6bb2"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.241258 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.241224 2580 generic.go:358] "Generic (PLEG): container finished" podID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerID="36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399" exitCode=137
Apr 24 22:08:13.241439 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.241299 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" event={"ID":"6aad58e0-06d6-49d3-aa91-93ae900e6bb2","Type":"ContainerDied","Data":"36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399"}
Apr 24 22:08:13.241439 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.241339 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h" event={"ID":"6aad58e0-06d6-49d3-aa91-93ae900e6bb2","Type":"ContainerDied","Data":"b6767fe2c19d579b15f1aed31956332814fd43c62c50ae3f269ef92c1fba82d9"}
Apr 24 22:08:13.241439 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.241339 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"
Apr 24 22:08:13.241439 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.241359 2580 scope.go:117] "RemoveContainer" containerID="36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399"
Apr 24 22:08:13.242855 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.242835 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-c8d6887cf-khlr5_374952dd-33de-4046-878a-c2574579f174/main/0.log"
Apr 24 22:08:13.243745 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.243621 2580 generic.go:358] "Generic (PLEG): container finished" podID="374952dd-33de-4046-878a-c2574579f174" containerID="d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d" exitCode=137
Apr 24 22:08:13.243745 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.243650 2580 generic.go:358] "Generic (PLEG): container finished" podID="374952dd-33de-4046-878a-c2574579f174" containerID="c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b" exitCode=0
Apr 24 22:08:13.243919 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.243759 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"
Apr 24 22:08:13.243919 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.243756 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" event={"ID":"374952dd-33de-4046-878a-c2574579f174","Type":"ContainerDied","Data":"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d"}
Apr 24 22:08:13.243919 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.243877 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" event={"ID":"374952dd-33de-4046-878a-c2574579f174","Type":"ContainerDied","Data":"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b"}
Apr 24 22:08:13.243919 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.243918 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5" event={"ID":"374952dd-33de-4046-878a-c2574579f174","Type":"ContainerDied","Data":"9bddb5b1e6d12a9b51482a4e7fcde1a6601be0bbc914f8643bd5882dd77944f1"}
Apr 24 22:08:13.253269 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.253240 2580 scope.go:117] "RemoveContainer" containerID="21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2"
Apr 24 22:08:13.308760 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.308716 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "374952dd-33de-4046-878a-c2574579f174" (UID: "374952dd-33de-4046-878a-c2574579f174"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.310430 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.310383 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6aad58e0-06d6-49d3-aa91-93ae900e6bb2" (UID: "6aad58e0-06d6-49d3-aa91-93ae900e6bb2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.310740 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.310721 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kserve-provision-location\") pod \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\" (UID: \"6aad58e0-06d6-49d3-aa91-93ae900e6bb2\") "
Apr 24 22:08:13.310868 ip-10-0-131-237 kubenswrapper[2580]: W0424 22:08:13.310849 2580 empty_dir.go:511] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6aad58e0-06d6-49d3-aa91-93ae900e6bb2/volumes/kubernetes.io~empty-dir/kserve-provision-location
Apr 24 22:08:13.310926 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.310872 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6aad58e0-06d6-49d3-aa91-93ae900e6bb2" (UID: "6aad58e0-06d6-49d3-aa91-93ae900e6bb2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:08:13.310982 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.310971 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n4c5j\" (UniqueName: \"kubernetes.io/projected/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kube-api-access-n4c5j\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 22:08:13.311038 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.310987 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 22:08:13.311038 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.310996 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 22:08:13.311038 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311006 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 22:08:13.311038 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311014 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 22:08:13.311038 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311024 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\""
Apr 24 22:08:13.311038 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311032 2580 reconciler_common.go:299] "Volume 
detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:08:13.311038 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311039 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:08:13.311388 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311050 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:08:13.311388 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311063 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/374952dd-33de-4046-878a-c2574579f174-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:08:13.311388 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311072 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:08:13.311388 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311082 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dgb7b\" (UniqueName: \"kubernetes.io/projected/374952dd-33de-4046-878a-c2574579f174-kube-api-access-dgb7b\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:08:13.311388 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311094 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6aad58e0-06d6-49d3-aa91-93ae900e6bb2-kserve-provision-location\") on 
node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:08:13.311388 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.311103 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/374952dd-33de-4046-878a-c2574579f174-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:08:13.321806 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.321784 2580 scope.go:117] "RemoveContainer" containerID="36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399" Apr 24 22:08:13.322195 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:08:13.322174 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399\": container with ID starting with 36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399 not found: ID does not exist" containerID="36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399" Apr 24 22:08:13.322268 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.322205 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399"} err="failed to get container status \"36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399\": rpc error: code = NotFound desc = could not find container \"36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399\": container with ID starting with 36bb2d32c2a7e06244f195d14166ca8ada55855cb7b4f56ecbe6f839cea68399 not found: ID does not exist" Apr 24 22:08:13.322268 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.322225 2580 scope.go:117] "RemoveContainer" containerID="21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2" Apr 24 22:08:13.322497 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:08:13.322481 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2\": container with ID starting with 21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2 not found: ID does not exist" containerID="21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2" Apr 24 22:08:13.322550 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.322504 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2"} err="failed to get container status \"21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2\": rpc error: code = NotFound desc = could not find container \"21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2\": container with ID starting with 21f874b275b12669b791c91b9d3ece6cc13bf45ef92850959a17bdced48c5eb2 not found: ID does not exist" Apr 24 22:08:13.322550 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.322520 2580 scope.go:117] "RemoveContainer" containerID="d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d" Apr 24 22:08:13.332437 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.332414 2580 scope.go:117] "RemoveContainer" containerID="8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a" Apr 24 22:08:13.394986 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.394961 2580 scope.go:117] "RemoveContainer" containerID="c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b" Apr 24 22:08:13.404232 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.404198 2580 scope.go:117] "RemoveContainer" containerID="d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d" Apr 24 22:08:13.404533 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:08:13.404508 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d\": container with ID starting with d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d not found: ID does not exist" containerID="d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d" Apr 24 22:08:13.404623 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.404544 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d"} err="failed to get container status \"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d\": rpc error: code = NotFound desc = could not find container \"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d\": container with ID starting with d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d not found: ID does not exist" Apr 24 22:08:13.404623 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.404571 2580 scope.go:117] "RemoveContainer" containerID="8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a" Apr 24 22:08:13.404921 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:08:13.404893 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a\": container with ID starting with 8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a not found: ID does not exist" containerID="8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a" Apr 24 22:08:13.405045 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.404928 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a"} err="failed to get container status \"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a\": rpc error: code = NotFound desc = could not find container 
\"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a\": container with ID starting with 8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a not found: ID does not exist" Apr 24 22:08:13.405045 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.404955 2580 scope.go:117] "RemoveContainer" containerID="c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b" Apr 24 22:08:13.405206 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:08:13.405187 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b\": container with ID starting with c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b not found: ID does not exist" containerID="c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b" Apr 24 22:08:13.405255 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.405211 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b"} err="failed to get container status \"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b\": rpc error: code = NotFound desc = could not find container \"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b\": container with ID starting with c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b not found: ID does not exist" Apr 24 22:08:13.405255 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.405227 2580 scope.go:117] "RemoveContainer" containerID="d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d" Apr 24 22:08:13.405508 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.405475 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d"} err="failed to get container status 
\"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d\": rpc error: code = NotFound desc = could not find container \"d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d\": container with ID starting with d5c9a9df792fdb90cee22bb2b2a5128ca933134c7f82e0688af285a16702383d not found: ID does not exist" Apr 24 22:08:13.405508 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.405498 2580 scope.go:117] "RemoveContainer" containerID="8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a" Apr 24 22:08:13.405748 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.405729 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a"} err="failed to get container status \"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a\": rpc error: code = NotFound desc = could not find container \"8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a\": container with ID starting with 8b3bfd161fdb421434c9b827f971274a254849d31e3e9d8bbe0f1eefd494da7a not found: ID does not exist" Apr 24 22:08:13.405748 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.405747 2580 scope.go:117] "RemoveContainer" containerID="c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b" Apr 24 22:08:13.405990 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.405969 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b"} err="failed to get container status \"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b\": rpc error: code = NotFound desc = could not find container \"c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b\": container with ID starting with c4b01c3a10d11358526bc49cc9baf85a9bb223a79a92ca84d3fad7874ebe637b not found: ID does not exist" Apr 24 22:08:13.566722 ip-10-0-131-237 
kubenswrapper[2580]: I0424 22:08:13.566676 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"] Apr 24 22:08:13.571300 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.571255 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-844c545574-9vx4h"] Apr 24 22:08:13.582949 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.582916 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"] Apr 24 22:08:13.586955 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:13.586925 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-c8d6887cf-khlr5"] Apr 24 22:08:14.578560 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:14.578516 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="374952dd-33de-4046-878a-c2574579f174" path="/var/lib/kubelet/pods/374952dd-33de-4046-878a-c2574579f174/volumes" Apr 24 22:08:14.579061 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:14.579002 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" path="/var/lib/kubelet/pods/6aad58e0-06d6-49d3-aa91-93ae900e6bb2/volumes" Apr 24 22:08:14.654900 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:14.654844 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:08:24.655148 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:24.655098 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:08:34.655116 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:34.655070 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:08:44.655026 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:44.654977 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:08:54.654824 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:08:54.654773 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:09:04.654153 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:04.654096 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:09:14.654749 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:14.654701 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:09:24.654598 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:24.654542 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:09:34.654496 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:34.654440 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.44:8000/health\": dial tcp 10.132.0.44:8000: connect: connection refused" Apr 24 22:09:44.664137 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:44.664059 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:09:44.672138 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:44.672106 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:09:53.071633 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.071593 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:09:53.072247 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.072003 2580 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main" containerID="cri-o://d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa" gracePeriod=30 Apr 24 22:09:53.815116 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.815090 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:09:53.905875 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.905781 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-model-cache\") pod \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " Apr 24 22:09:53.905875 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.905819 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgkp5\" (UniqueName: \"kubernetes.io/projected/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kube-api-access-vgkp5\") pod \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " Apr 24 22:09:53.905875 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.905844 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tls-certs\") pod \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " Apr 24 22:09:53.905875 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.905876 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-home\") pod \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " Apr 24 22:09:53.906239 ip-10-0-131-237 
kubenswrapper[2580]: I0424 22:09:53.905914 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-dshm\") pod \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " Apr 24 22:09:53.906239 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.905948 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tmp-dir\") pod \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " Apr 24 22:09:53.906239 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.905979 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kserve-provision-location\") pod \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\" (UID: \"c154fa61-5139-4e2d-b2d9-345eee5fa7e4\") " Apr 24 22:09:53.906239 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.906113 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-model-cache" (OuterVolumeSpecName: "model-cache") pod "c154fa61-5139-4e2d-b2d9-345eee5fa7e4" (UID: "c154fa61-5139-4e2d-b2d9-345eee5fa7e4"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:53.906514 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.906324 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:09:53.906719 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.906686 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-home" (OuterVolumeSpecName: "home") pod "c154fa61-5139-4e2d-b2d9-345eee5fa7e4" (UID: "c154fa61-5139-4e2d-b2d9-345eee5fa7e4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:53.908552 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.908521 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-dshm" (OuterVolumeSpecName: "dshm") pod "c154fa61-5139-4e2d-b2d9-345eee5fa7e4" (UID: "c154fa61-5139-4e2d-b2d9-345eee5fa7e4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:53.908692 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.908541 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kube-api-access-vgkp5" (OuterVolumeSpecName: "kube-api-access-vgkp5") pod "c154fa61-5139-4e2d-b2d9-345eee5fa7e4" (UID: "c154fa61-5139-4e2d-b2d9-345eee5fa7e4"). InnerVolumeSpecName "kube-api-access-vgkp5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:09:53.908932 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.908900 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c154fa61-5139-4e2d-b2d9-345eee5fa7e4" (UID: "c154fa61-5139-4e2d-b2d9-345eee5fa7e4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:09:53.923008 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.922969 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "c154fa61-5139-4e2d-b2d9-345eee5fa7e4" (UID: "c154fa61-5139-4e2d-b2d9-345eee5fa7e4"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:53.969155 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:53.969106 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c154fa61-5139-4e2d-b2d9-345eee5fa7e4" (UID: "c154fa61-5139-4e2d-b2d9-345eee5fa7e4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:09:54.007338 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.007300 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.007338 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.007332 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.007338 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.007340 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.007338 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.007349 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.007614 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.007358 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.007614 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.007367 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vgkp5\" (UniqueName: \"kubernetes.io/projected/c154fa61-5139-4e2d-b2d9-345eee5fa7e4-kube-api-access-vgkp5\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:09:54.585534 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.585221 2580 generic.go:358] 
"Generic (PLEG): container finished" podID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerID="d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa" exitCode=0 Apr 24 22:09:54.585534 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.585282 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c154fa61-5139-4e2d-b2d9-345eee5fa7e4","Type":"ContainerDied","Data":"d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa"} Apr 24 22:09:54.585534 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.585307 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"c154fa61-5139-4e2d-b2d9-345eee5fa7e4","Type":"ContainerDied","Data":"ae4a45e8f7e2397922f317a05ea6e74c3e218ae99bd4c9ad7d79a55c64261c41"} Apr 24 22:09:54.585534 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.585327 2580 scope.go:117] "RemoveContainer" containerID="d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa" Apr 24 22:09:54.585534 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.585358 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:09:54.593724 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.593703 2580 scope.go:117] "RemoveContainer" containerID="e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4" Apr 24 22:09:54.604868 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.604830 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:09:54.605485 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.605449 2580 scope.go:117] "RemoveContainer" containerID="d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa" Apr 24 22:09:54.605866 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:09:54.605841 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa\": container with ID starting with d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa not found: ID does not exist" containerID="d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa" Apr 24 22:09:54.605963 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.605874 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa"} err="failed to get container status \"d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa\": rpc error: code = NotFound desc = could not find container \"d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa\": container with ID starting with d2a049b81e67f0b823459a684eb81e509bd075390e692b2d9f1a690c563612fa not found: ID does not exist" Apr 24 22:09:54.605963 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.605895 2580 scope.go:117] "RemoveContainer" 
containerID="e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4" Apr 24 22:09:54.606214 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:09:54.606196 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4\": container with ID starting with e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4 not found: ID does not exist" containerID="e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4" Apr 24 22:09:54.606266 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.606229 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4"} err="failed to get container status \"e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4\": rpc error: code = NotFound desc = could not find container \"e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4\": container with ID starting with e1ac57f09f3c9cce7290676768843ef5e37ce638b21183ed411d3053be3220e4 not found: ID does not exist" Apr 24 22:09:54.606704 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:54.606679 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:09:56.577207 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:09:56.577173 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" path="/var/lib/kubelet/pods/c154fa61-5139-4e2d-b2d9-345eee5fa7e4/volumes" Apr 24 22:11:26.653959 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:11:26.653877 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 22:11:26.661468 ip-10-0-131-237 kubenswrapper[2580]: I0424 
22:11:26.661444 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log" Apr 24 22:12:26.043437 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:26.043399 2580 generic.go:358] "Generic (PLEG): container finished" podID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerID="d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9" exitCode=0 Apr 24 22:12:26.043902 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:26.043476 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" event={"ID":"e8ac6250-7e00-4302-94b4-8c94b7e3c434","Type":"ContainerDied","Data":"d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9"} Apr 24 22:12:26.044611 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:26.044595 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:12:27.047372 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:27.047335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" event={"ID":"e8ac6250-7e00-4302-94b4-8c94b7e3c434","Type":"ContainerStarted","Data":"febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe"} Apr 24 22:12:27.069801 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:27.069747 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podStartSLOduration=280.069726665 podStartE2EDuration="4m40.069726665s" podCreationTimestamp="2026-04-24 22:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:12:27.067245493 +0000 UTC m=+2761.051460273" 
watchObservedRunningTime="2026-04-24 22:12:27.069726665 +0000 UTC m=+2761.053941447" Apr 24 22:12:28.272451 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:28.272405 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" Apr 24 22:12:28.272451 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:28.272461 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" Apr 24 22:12:28.274031 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:28.274001 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:12:38.273328 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:38.273273 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:12:48.272779 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:48.272690 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:12:58.273283 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:12:58.273242 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:13:08.272508 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:13:08.272450 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:13:18.273189 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:13:18.273133 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:13:28.273039 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:13:28.272995 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:13:38.275019 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:13:38.274968 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:13:48.272786 
ip-10-0-131-237 kubenswrapper[2580]: I0424 22:13:48.272737 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:13:58.273009 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:13:58.272959 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" probeResult="failure" output="Get \"https://10.132.0.45:8000/health\": dial tcp 10.132.0.45:8000: connect: connection refused" Apr 24 22:14:08.282073 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:08.282031 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" Apr 24 22:14:08.290072 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:08.290043 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" Apr 24 22:14:12.631945 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:12.631866 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"] Apr 24 22:14:12.632436 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:12.632161 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main" containerID="cri-o://febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe" gracePeriod=30 Apr 24 22:14:28.242878 ip-10-0-131-237 
kubenswrapper[2580]: I0424 22:14:28.242848 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:28.261493 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:28.261465 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:29.243272 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:29.243241 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:29.251649 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:29.251619 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:30.244023 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:30.243988 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:30.257419 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:30.257382 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:31.236325 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:31.236292 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:31.247295 ip-10-0-131-237 
kubenswrapper[2580]: I0424 22:14:31.247261 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:32.215307 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:32.215275 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:32.224767 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:32.224741 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:33.198992 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:33.198964 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:33.206815 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:33.206791 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:34.155370 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:34.155333 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:34.163378 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:34.163349 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:35.110990 
ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:35.110959 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:35.120807 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:35.120771 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:36.081171 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:36.081142 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:36.089182 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:36.089153 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:37.026862 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:37.026830 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:37.034638 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:37.034612 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:37.988458 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:37.988421 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:38.000059 
ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:38.000031 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:38.957796 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:38.957761 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:38.967826 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:38.967791 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:39.986120 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:39.986083 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:39.995319 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:39.995297 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 22:14:41.011970 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:41.011943 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:41.022916 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:41.022889 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/storage-initializer/0.log" Apr 24 
22:14:42.101267 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.101215 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-66f8f496b4-4gdhz_1b235dfc-3b04-476b-ac23-2c6473035a29/router/0.log" Apr 24 22:14:42.886205 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.886139 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:42.886495 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.886478 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" Apr 24 22:14:42.934729 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.934701 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-66f8f496b4-4gdhz_1b235dfc-3b04-476b-ac23-2c6473035a29/router/0.log" Apr 24 22:14:42.943619 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.943596 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-dshm\") pod \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " Apr 24 22:14:42.943758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.943624 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-home\") pod \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " Apr 24 22:14:42.943758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.943651 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kserve-provision-location\") pod \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " Apr 24 22:14:42.943758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.943730 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tmp-dir\") pod \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " Apr 24 22:14:42.943888 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.943761 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-model-cache\") pod \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " Apr 24 22:14:42.943888 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.943802 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z42bt\" (UniqueName: \"kubernetes.io/projected/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kube-api-access-z42bt\") pod \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " Apr 24 22:14:42.943888 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.943840 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tls-certs\") pod \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\" (UID: \"e8ac6250-7e00-4302-94b4-8c94b7e3c434\") " Apr 24 22:14:42.944139 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.944109 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-model-cache" (OuterVolumeSpecName: "model-cache") pod "e8ac6250-7e00-4302-94b4-8c94b7e3c434" (UID: 
"e8ac6250-7e00-4302-94b4-8c94b7e3c434"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:14:42.944452 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.944417 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-home" (OuterVolumeSpecName: "home") pod "e8ac6250-7e00-4302-94b4-8c94b7e3c434" (UID: "e8ac6250-7e00-4302-94b4-8c94b7e3c434"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:14:42.946226 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.946188 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kube-api-access-z42bt" (OuterVolumeSpecName: "kube-api-access-z42bt") pod "e8ac6250-7e00-4302-94b4-8c94b7e3c434" (UID: "e8ac6250-7e00-4302-94b4-8c94b7e3c434"). InnerVolumeSpecName "kube-api-access-z42bt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:14:42.946637 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.946609 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-dshm" (OuterVolumeSpecName: "dshm") pod "e8ac6250-7e00-4302-94b4-8c94b7e3c434" (UID: "e8ac6250-7e00-4302-94b4-8c94b7e3c434"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:14:42.946637 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.946620 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e8ac6250-7e00-4302-94b4-8c94b7e3c434" (UID: "e8ac6250-7e00-4302-94b4-8c94b7e3c434"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:14:42.962958 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:42.962915 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "e8ac6250-7e00-4302-94b4-8c94b7e3c434" (UID: "e8ac6250-7e00-4302-94b4-8c94b7e3c434"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:14:43.013861 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.013814 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e8ac6250-7e00-4302-94b4-8c94b7e3c434" (UID: "e8ac6250-7e00-4302-94b4-8c94b7e3c434"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:14:43.045070 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.045036 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tmp-dir\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:14:43.045070 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.045067 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-model-cache\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:14:43.045219 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.045079 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z42bt\" (UniqueName: \"kubernetes.io/projected/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kube-api-access-z42bt\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:14:43.045219 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.045090 
2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e8ac6250-7e00-4302-94b4-8c94b7e3c434-tls-certs\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:14:43.045219 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.045099 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-dshm\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:14:43.045219 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.045107 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-home\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:14:43.045219 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.045115 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ac6250-7e00-4302-94b4-8c94b7e3c434-kserve-provision-location\") on node \"ip-10-0-131-237.ec2.internal\" DevicePath \"\"" Apr 24 22:14:43.471235 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.471201 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl_e8ac6250-7e00-4302-94b4-8c94b7e3c434/main/0.log" Apr 24 22:14:43.471758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.471540 2580 generic.go:358] "Generic (PLEG): container finished" podID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerID="febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe" exitCode=137 Apr 24 22:14:43.471758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.471624 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" Apr 24 22:14:43.471758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.471635 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" event={"ID":"e8ac6250-7e00-4302-94b4-8c94b7e3c434","Type":"ContainerDied","Data":"febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe"} Apr 24 22:14:43.471758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.471707 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl" event={"ID":"e8ac6250-7e00-4302-94b4-8c94b7e3c434","Type":"ContainerDied","Data":"b38064159c5dcfc1af266bfed649da238b55f1044767443dfd9df32b6ecb8580"} Apr 24 22:14:43.471758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.471728 2580 scope.go:117] "RemoveContainer" containerID="febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe" Apr 24 22:14:43.480537 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.480513 2580 scope.go:117] "RemoveContainer" containerID="d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9" Apr 24 22:14:43.494123 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.494095 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"] Apr 24 22:14:43.497675 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.497635 2580 scope.go:117] "RemoveContainer" containerID="febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe" Apr 24 22:14:43.498026 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:14:43.497996 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe\": container with ID starting with 
febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe not found: ID does not exist" containerID="febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe"
Apr 24 22:14:43.498132 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.498037 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe"} err="failed to get container status \"febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe\": rpc error: code = NotFound desc = could not find container \"febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe\": container with ID starting with febe9acfd8dc627ab6738d3e9d471f586d89f348879047e557f679709b7e77fe not found: ID does not exist"
Apr 24 22:14:43.498132 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.498063 2580 scope.go:117] "RemoveContainer" containerID="d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9"
Apr 24 22:14:43.498252 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.498165 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7f4d7d77c7k74kl"]
Apr 24 22:14:43.498360 ip-10-0-131-237 kubenswrapper[2580]: E0424 22:14:43.498336 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9\": container with ID starting with d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9 not found: ID does not exist" containerID="d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9"
Apr 24 22:14:43.498401 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.498369 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9"} err="failed to get container status \"d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9\": rpc error: code = NotFound desc = could not find container \"d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9\": container with ID starting with d7f59c4503049d91c5ffdb7bb82c8f91cbd3a8e2b53d52385de55af7a20d7ec9 not found: ID does not exist"
Apr 24 22:14:43.692446 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.692405 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-rdwcv_e680c157-9498-4ee6-ae79-941f7eb8f874/authorino/0.log"
Apr 24 22:14:43.746801 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:43.746716 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-7zfmp_be5848e5-3de3-438f-b807-9f00b1969cd4/kuadrant-console-plugin/0.log"
Apr 24 22:14:44.577465 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:44.577432 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" path="/var/lib/kubelet/pods/e8ac6250-7e00-4302-94b4-8c94b7e3c434/volumes"
Apr 24 22:14:49.022713 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:49.022669 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vqqh6_4fcf0e65-bdc1-401b-98e2-00ff3294162f/global-pull-secret-syncer/0.log"
Apr 24 22:14:49.138817 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:49.138783 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vnffz_7ae4eef7-d0e0-4916-bb0a-0752f2af5c3e/konnectivity-agent/0.log"
Apr 24 22:14:49.191218 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:49.191166 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-237.ec2.internal_d266bc3b3ff16c73161a2d12b87975ef/haproxy/0.log"
Apr 24 22:14:52.935103 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:52.935050 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-rdwcv_e680c157-9498-4ee6-ae79-941f7eb8f874/authorino/0.log"
Apr 24 22:14:53.074592 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:53.074562 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-7zfmp_be5848e5-3de3-438f-b807-9f00b1969cd4/kuadrant-console-plugin/0.log"
Apr 24 22:14:54.445315 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:54.445285 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-pb7ns_e1461942-917b-4737-86c2-fbe05a16beae/cluster-monitoring-operator/0.log"
Apr 24 22:14:54.613280 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:54.613254 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7mgnq_b7c71279-6058-4d71-996a-5f54db9b0320/node-exporter/0.log"
Apr 24 22:14:54.636796 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:54.636764 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7mgnq_b7c71279-6058-4d71-996a-5f54db9b0320/kube-rbac-proxy/0.log"
Apr 24 22:14:54.657998 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:54.657963 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7mgnq_b7c71279-6058-4d71-996a-5f54db9b0320/init-textfile/0.log"
Apr 24 22:14:56.706132 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:56.706097 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-8bbnh_0157928b-5de1-4b95-b80f-c4ebf799bce3/networking-console-plugin/0.log"
Apr 24 22:14:57.247367 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:57.247336 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/1.log"
Apr 24 22:14:57.251608 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:57.251569 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xngnk_6ad3b0df-323a-4ac2-bc1a-d5da7af6e8fd/console-operator/2.log"
Apr 24 22:14:58.078891 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.078855 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"]
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079146 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079159 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079169 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079174 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079183 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="storage-initializer"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079190 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="storage-initializer"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079200 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="storage-initializer"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079205 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="storage-initializer"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079213 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="storage-initializer"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079218 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="storage-initializer"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079225 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="llm-d-routing-sidecar"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079230 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="llm-d-routing-sidecar"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079237 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079241 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079249 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="storage-initializer"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079254 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="storage-initializer"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079259 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main"
Apr 24 22:14:58.079277 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079263 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main"
Apr 24 22:14:58.079864 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079309 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c154fa61-5139-4e2d-b2d9-345eee5fa7e4" containerName="main"
Apr 24 22:14:58.079864 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079318 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="main"
Apr 24 22:14:58.079864 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079324 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="6aad58e0-06d6-49d3-aa91-93ae900e6bb2" containerName="main"
Apr 24 22:14:58.079864 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079331 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="374952dd-33de-4046-878a-c2574579f174" containerName="llm-d-routing-sidecar"
Apr 24 22:14:58.079864 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.079339 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8ac6250-7e00-4302-94b4-8c94b7e3c434" containerName="main"
Apr 24 22:14:58.084791 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.084766 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.087091 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.087069 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lj9kt\"/\"default-dockercfg-w6nqf\""
Apr 24 22:14:58.087232 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.087067 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lj9kt\"/\"openshift-service-ca.crt\""
Apr 24 22:14:58.087904 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.087881 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lj9kt\"/\"kube-root-ca.crt\""
Apr 24 22:14:58.092969 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.092945 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"]
Apr 24 22:14:58.178260 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.178217 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-lib-modules\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.178453 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.178346 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-sys\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.178453 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.178391 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-podres\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.178453 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.178416 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-proc\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.178453 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.178433 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jcb6\" (UniqueName: \"kubernetes.io/projected/36cd0907-96cd-48cc-9f05-cbc68f662f4e-kube-api-access-4jcb6\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.241232 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.241200 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-zqdt8_bc8aa199-ad2f-4557-9430-ad968419174b/volume-data-source-validator/0.log"
Apr 24 22:14:58.279604 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279562 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-podres\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.279604 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279610 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-proc\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.279885 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279630 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jcb6\" (UniqueName: \"kubernetes.io/projected/36cd0907-96cd-48cc-9f05-cbc68f662f4e-kube-api-access-4jcb6\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.279885 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279693 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-lib-modules\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.279885 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279731 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-proc\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.279885 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279751 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-podres\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.279885 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279813 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-sys\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.279885 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279848 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-lib-modules\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.279885 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.279871 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/36cd0907-96cd-48cc-9f05-cbc68f662f4e-sys\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.287769 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.287733 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jcb6\" (UniqueName: \"kubernetes.io/projected/36cd0907-96cd-48cc-9f05-cbc68f662f4e-kube-api-access-4jcb6\") pod \"perf-node-gather-daemonset-4m46k\" (UID: \"36cd0907-96cd-48cc-9f05-cbc68f662f4e\") " pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.395995 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.395965 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:58.521961 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:58.521929 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"]
Apr 24 22:14:58.525421 ip-10-0-131-237 kubenswrapper[2580]: W0424 22:14:58.525390 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod36cd0907_96cd_48cc_9f05_cbc68f662f4e.slice/crio-17602556f7f348e1a43e5898b651aecee08776e30eec90367b21aa37eeaba921 WatchSource:0}: Error finding container 17602556f7f348e1a43e5898b651aecee08776e30eec90367b21aa37eeaba921: Status 404 returned error can't find the container with id 17602556f7f348e1a43e5898b651aecee08776e30eec90367b21aa37eeaba921
Apr 24 22:14:59.065928 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:59.065897 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lfbbd_ac4d827d-3c57-468f-bb34-d01bb87a171e/dns/0.log"
Apr 24 22:14:59.087167 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:59.087140 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lfbbd_ac4d827d-3c57-468f-bb34-d01bb87a171e/kube-rbac-proxy/0.log"
Apr 24 22:14:59.182187 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:59.182159 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kp4z8_8d58b512-cb70-43b1-ac41-1be3111f0ccc/dns-node-resolver/0.log"
Apr 24 22:14:59.520997 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:59.520951 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k" event={"ID":"36cd0907-96cd-48cc-9f05-cbc68f662f4e","Type":"ContainerStarted","Data":"50e870afc15b6e617d36b3efc68f0cb187601ec6191bc5d21d49b311eea0b810"}
Apr 24 22:14:59.520997 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:59.521001 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k" event={"ID":"36cd0907-96cd-48cc-9f05-cbc68f662f4e","Type":"ContainerStarted","Data":"17602556f7f348e1a43e5898b651aecee08776e30eec90367b21aa37eeaba921"}
Apr 24 22:14:59.521374 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:59.521099 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:14:59.538784 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:59.538725 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k" podStartSLOduration=1.538708195 podStartE2EDuration="1.538708195s" podCreationTimestamp="2026-04-24 22:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:14:59.537444805 +0000 UTC m=+2913.521659586" watchObservedRunningTime="2026-04-24 22:14:59.538708195 +0000 UTC m=+2913.522922973"
Apr 24 22:14:59.745721 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:14:59.745690 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tcjn9_6fcc9f57-5d75-40c0-88a6-9f4985a693ad/node-ca/0.log"
Apr 24 22:15:00.626025 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:00.625998 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-66f8f496b4-4gdhz_1b235dfc-3b04-476b-ac23-2c6473035a29/router/0.log"
Apr 24 22:15:01.116768 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:01.116729 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7mllt_01ac23ff-5ced-4c6d-b63f-bd951a6746ec/serve-healthcheck-canary/0.log"
Apr 24 22:15:01.615258 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:01.615225 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-6rdb2_48f5d05c-b88d-481a-b374-755d285e0f8f/insights-operator/0.log"
Apr 24 22:15:01.616758 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:01.616738 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-6rdb2_48f5d05c-b88d-481a-b374-755d285e0f8f/insights-operator/1.log"
Apr 24 22:15:01.707978 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:01.707948 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjpnr_6bed2968-018b-485a-8c19-169e6c4ebbb5/kube-rbac-proxy/0.log"
Apr 24 22:15:01.733214 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:01.733178 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjpnr_6bed2968-018b-485a-8c19-169e6c4ebbb5/exporter/0.log"
Apr 24 22:15:01.755229 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:01.755195 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjpnr_6bed2968-018b-485a-8c19-169e6c4ebbb5/extractor/0.log"
Apr 24 22:15:04.366358 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:04.366323 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-798cfdf8b-9vksz_2a9f2315-f28a-4d4b-959e-97cdc222b75c/manager/0.log"
Apr 24 22:15:04.388904 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:04.388874 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-4bw74_c0fc86b5-b817-4040-b25b-4a48aaabf1d2/openshift-lws-operator/0.log"
Apr 24 22:15:05.534020 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:05.533992 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lj9kt/perf-node-gather-daemonset-4m46k"
Apr 24 22:15:10.477873 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:10.477834 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-46xnr_fd03e3ed-1908-42a1-8d28-3bdf4b8e27be/kube-storage-version-migrator-operator/1.log"
Apr 24 22:15:10.478902 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:10.478877 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-46xnr_fd03e3ed-1908-42a1-8d28-3bdf4b8e27be/kube-storage-version-migrator-operator/0.log"
Apr 24 22:15:11.473644 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:11.473605 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hrvb_c966a75c-1583-49c7-802b-498b767cf3f6/kube-multus/0.log"
Apr 24 22:15:11.682968 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:11.682941 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tqs9r_995ed227-fb30-4b70-9c48-e4516dc0a85c/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:15:11.709852 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:11.709822 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tqs9r_995ed227-fb30-4b70-9c48-e4516dc0a85c/egress-router-binary-copy/0.log"
Apr 24 22:15:11.736538 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:11.736465 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tqs9r_995ed227-fb30-4b70-9c48-e4516dc0a85c/cni-plugins/0.log"
Apr 24 22:15:11.759805 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:11.759773 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tqs9r_995ed227-fb30-4b70-9c48-e4516dc0a85c/bond-cni-plugin/0.log"
Apr 24 22:15:11.784346 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:11.784315 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tqs9r_995ed227-fb30-4b70-9c48-e4516dc0a85c/routeoverride-cni/0.log"
Apr 24 22:15:11.806739 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:11.806714 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tqs9r_995ed227-fb30-4b70-9c48-e4516dc0a85c/whereabouts-cni-bincopy/0.log"
Apr 24 22:15:11.828790 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:11.828761 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tqs9r_995ed227-fb30-4b70-9c48-e4516dc0a85c/whereabouts-cni/0.log"
Apr 24 22:15:12.206394 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:12.206365 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hf9r5_88be4377-88c5-417f-8cba-f0a7f6d5f16e/network-metrics-daemon/0.log"
Apr 24 22:15:12.229185 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:12.229152 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hf9r5_88be4377-88c5-417f-8cba-f0a7f6d5f16e/kube-rbac-proxy/0.log"
Apr 24 22:15:13.711425 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:13.711375 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwmjf_be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe/ovn-controller/0.log"
Apr 24 22:15:13.740377 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:13.740347 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwmjf_be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe/ovn-acl-logging/0.log"
Apr 24 22:15:13.757451 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:13.757419 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwmjf_be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe/kube-rbac-proxy-node/0.log"
Apr 24 22:15:13.777483 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:13.777455 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwmjf_be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:15:13.797288 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:13.797263 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwmjf_be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe/northd/0.log"
Apr 24 22:15:13.818232 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:13.818203 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwmjf_be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe/nbdb/0.log"
Apr 24 22:15:13.856545 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:13.856498 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwmjf_be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe/sbdb/0.log"
Apr 24 22:15:13.978062 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:13.977978 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwmjf_be9fed09-b2c0-4c7e-a5be-a42ee2e2edfe/ovnkube-controller/0.log"
Apr 24 22:15:15.028485 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:15.028451 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-lks6f_17a3c1f7-f576-4683-9638-e6eab0e8ff34/check-endpoints/0.log"
Apr 24 22:15:15.080894 ip-10-0-131-237 kubenswrapper[2580]: I0424 22:15:15.080859 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hzw5v_7e98eeca-23eb-4e4c-b591-118f914a93a1/network-check-target-container/0.log"