Apr 22 17:52:51.587012 ip-10-0-142-118 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:52:51.981842 ip-10-0-142-118 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:52:51.981842 ip-10-0-142-118 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:52:51.981842 ip-10-0-142-118 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:52:51.981842 ip-10-0-142-118 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:52:51.982876 ip-10-0-142-118 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:52:51.984930 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.984814    2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:52:51.990458 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990434    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:52:51.990458 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990454    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:52:51.990458 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990458    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:52:51.990458 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990461    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:52:51.990458 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990464    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:52:51.990458 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990467    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990470    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990473    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990476    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990479    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990482    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990485    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990487    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990490    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990493    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990495    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990498    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990500    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990503    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990505    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990508    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990511    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990514    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990517    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990519    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:52:51.990690 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990522    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990528    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990531    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990533    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990536    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990540    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990544    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990548    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990551    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990554    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990556    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990559    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990562    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990564    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990567    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990570    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990573    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990575    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990578    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990580    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:52:51.991282 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990583    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990586    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990588    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990591    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990595    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990598    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990600    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990603    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990606    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990608    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990611    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990614    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990616    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990619    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990621    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990624    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990626    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990629    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990632    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990635    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:52:51.991802 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990637    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990641    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990645    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990649    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990653    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990657    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990661    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990666    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990669    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990671    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990674    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990677    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990679    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990684    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990687    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990689    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990694    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990699    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990702    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990706    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:52:51.992293 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.990709    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991125    2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991131    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991133    2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991136    2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991139    2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991142    2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991145    2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991147    2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991150    2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991153    2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991156    2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991159    2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991161    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991165    2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991169    2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991173    2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991175    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991179    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:52:51.992792 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991182    2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991185    2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991188    2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991191    2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991194    2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991196    2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991200    2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991202    2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991205    2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991207    2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991210    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991213    2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991216    2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991218    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991221    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991224    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991226    2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991229    2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991231    2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991234    2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:52:51.993256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991236    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991239    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991241    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991244    2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991247    2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991249    2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991252    2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991254    2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991257    2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991259    2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991262    2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991265    2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991267    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991270    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991273    2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991275    2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991278    2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991280    2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991283    2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991286    2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:52:51.993769 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991288    2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991291    2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991294    2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991296    2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991299    2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991302    2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991305    2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991307    2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991310    2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991312    2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991315    2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991317    2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991320    2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991322    2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991325    2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991329    2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991331    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991334    2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991337    2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991339    2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:52:51.994247 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991342    2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991344    2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991347    2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991350    2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991354    2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991357    2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991359    2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.991362    2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992884    2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992895    2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992903    2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992908    2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992913    2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992916    2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992921    2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992926    2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992930    2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992933    2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992936    2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992940    2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992943    2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992946    2568 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:52:51.994781 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992949    2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992952    2568 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992954    2568 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992958    2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992960    2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992965    2568 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992968    2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992971    2568 flags.go:64] FLAG: --config-dir=""
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992975    2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992978    2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992982    2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992985    2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992988    2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992992    2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992995    2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.992998    2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993001    2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993004    2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993007    2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993012    2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993015    2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993018    2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993020    2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993024    2568 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993027    2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:52:51.995349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993032    2568 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993035    2568 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993038    2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993041    2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993044    2568 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993048    2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993051    2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993054    2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993057    2568 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993060    2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993063    2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993066    2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993069    2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993073    2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993076    2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993079    2568 flags.go:64] FLAG: --feature-gates=""
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993083    2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993086    2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993089    2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993092    2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993096    2568 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993100    2568 flags.go:64] FLAG: --help="false"
Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]:
I0422 17:52:51.993102 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-142-118.ec2.internal" Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993106 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:52:51.995994 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993109 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993112 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993115 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993119 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993121 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993124 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993127 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993131 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993134 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993137 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993140 2568 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993143 2568 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993146 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993149 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993152 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993155 2568 flags.go:64] FLAG: --lock-file="" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993158 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993160 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993163 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993169 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993171 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993174 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993177 2568 flags.go:64] FLAG: --logging-format="text" Apr 22 17:52:51.996570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993180 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993183 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993186 2568 flags.go:64] FLAG: --manifest-url="" Apr 22 17:52:51.997148 ip-10-0-142-118 
kubenswrapper[2568]: I0422 17:52:51.993189 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993194 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993197 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993201 2568 flags.go:64] FLAG: --max-pods="110" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993204 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993207 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993210 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993213 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993216 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993219 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993222 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993231 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993234 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993237 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993241 
2568 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993244 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993249 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993252 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993256 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993259 2568 flags.go:64] FLAG: --port="10250" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993262 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:52:51.997148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993266 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ebac8fb6e5bcd02b" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993269 2568 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993273 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993276 2568 flags.go:64] FLAG: --register-node="true" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993279 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993282 2568 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993285 2568 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993288 2568 flags.go:64] FLAG: --registry-qps="5" 
Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993291 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993298 2568 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993302 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993305 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993308 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993311 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993314 2568 flags.go:64] FLAG: --runonce="false" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993317 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993320 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993323 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993326 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993329 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993332 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993334 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 
17:52:51.993338 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993341 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993343 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993346 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 17:52:51.997723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993350 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993353 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993356 2568 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993359 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993364 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993367 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993370 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993377 2568 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993380 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993383 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993386 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 
17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993388 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993391 2568 flags.go:64] FLAG: --v="2" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993396 2568 flags.go:64] FLAG: --version="false" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993400 2568 flags.go:64] FLAG: --vmodule="" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993406 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.993409 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993524 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993528 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993531 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993534 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993537 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993540 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993543 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:52:51.998363 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993546 2568 
feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993550 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993553 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993555 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993558 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993561 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993563 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993566 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993568 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993571 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993574 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993577 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993579 2568 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993582 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993585 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993587 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993590 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993592 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993595 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993598 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:52:51.998961 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993600 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993603 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993605 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993609 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993612 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:52:51.999491 ip-10-0-142-118 
kubenswrapper[2568]: W0422 17:52:51.993614 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993617 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993620 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993622 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993625 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993628 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993630 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993633 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993637 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993639 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993642 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993645 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993647 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:52:51.999491 
ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993650 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993653 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:52:51.999491 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993655 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993658 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993661 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993663 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993666 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993668 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993671 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993673 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993676 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993678 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993681 2568 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993683 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993686 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993689 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993691 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993695 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993698 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993701 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993704 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993707 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:52:52.000004 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993709 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993712 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993714 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: 
W0422 17:52:51.993718 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993722 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993745 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993750 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993754 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993758 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993761 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993764 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993767 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993770 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993772 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993775 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993779 
2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993782 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993785 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:52:52.000500 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:51.993787 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:51.994561 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.000830 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.000846 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000893 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000898 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000902 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000905 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000909 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000912 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000915 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000918 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000921 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000923 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000926 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000929 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:52:52.000963 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000932 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000934 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000937 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000940 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000943 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000945 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000948 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000951 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000953 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000956 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000959 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000961 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000964 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000967 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000969 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000973 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000975 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000978 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000980 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000984 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:52:52.001358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000987 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000989 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000992 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000994 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000997 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.000999 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001002 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001004 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001007 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001009 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001011 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001014 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001017 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001019 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001023 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001026 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001029 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001032 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001035 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001037 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:52:52.001865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001040 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001044 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001048 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001051 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001054 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001057 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001060 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001063 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001066 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001068 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001071 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001073 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001077 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001080 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001082 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001086 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001089 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001092 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001095 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:52:52.002358 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001097 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001100 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001103 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001105 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001107 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001110 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001112 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001116 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001118 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001121 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001123 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001126 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001129 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001131 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001134 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:52:52.002856 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.001139 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001236 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001242 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001244 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001247 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001251 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001254 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001257 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001259 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001262 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001264 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001267 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001270 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001272 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001277 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001280 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001283 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001287 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001290 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:52:52.003237 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001293 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001296 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001298 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001301 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001303 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001306 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001309 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001312 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001314 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001317 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001320 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001323 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001325 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001328 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001330 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001332 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001335 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001337 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001340 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001342 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:52:52.003746 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001345 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001347 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001350 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001352 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001354 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001357 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001360 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001363 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001366 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001368 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001371 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001374 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001376 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001379 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001381 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001383 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001386 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001389 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001391 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001394 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:52:52.004232 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001396 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001399 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001401 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001404 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001407 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001409 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001411 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001414 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001417 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001420 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001422 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001424 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001427 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001430 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001432 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001434 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001437 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001439 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001442 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001445 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:52:52.004864 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001447 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001450 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001452 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001455 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001457 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001459 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001462 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:52.001465 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.001470 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.002145 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 17:52:52.005356 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.004843 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 17:52:52.005696 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.005684 2568 server.go:1019] "Starting client certificate rotation"
Apr 22 17:52:52.006118 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.006102 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:52:52.006154 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.006136 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 17:52:52.028031 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.028013 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:52:52.031078 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.031050 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 17:52:52.044375 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.044353 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 22 17:52:52.049607 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.049587 2568 log.go:25] "Validated CRI v1 image API"
Apr 22 17:52:52.050828 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.050814 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 17:52:52.055570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.055543 2568 fs.go:135] Filesystem UUIDs: map[1c70a4ce-4fd5-422d-8463-30742057328b:/dev/nvme0n1p3 48943db2-dfa5-43e5-b93e-d8b25c4548cf:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 22 17:52:52.055642 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.055569 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 17:52:52.059891 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.059873 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:52:52.061388 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.061288 2568 manager.go:217] Machine: {Timestamp:2026-04-22 17:52:52.059535457 +0000 UTC m=+0.368166955 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100918 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fc9cf6bf72047e9c359bfc1cb426b SystemUUID:ec2fc9cf-6bf7-2047-e9c3-59bfc1cb426b BootID:d3e59ef5-a8e5-42cb-aa64-0992d949a34e Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b3:4b:7e:34:35 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b3:4b:7e:34:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:fc:b9:5b:ff:7f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 17:52:52.061388 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.061384 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 17:52:52.061494 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.061462 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 17:52:52.062463 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.062439 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 17:52:52.062597 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.062466 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-118.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 17:52:52.062638 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.062607 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 17:52:52.062638 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.062616 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 17:52:52.062638 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.062629 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:52:52.063428 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.063417 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 17:52:52.064753 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.064743 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 17:52:52.064856 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.064847 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 17:52:52.066776 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.066766 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 17:52:52.066816 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.066784 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 17:52:52.066816 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.066797 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 17:52:52.066816 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.066806 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 22 17:52:52.066816 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.066814 2568 apiserver.go:42] "Waiting for node sync
before watching apiserver pods" Apr 22 17:52:52.067838 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.067827 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:52:52.067880 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.067844 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:52:52.087051 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.087026 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 17:52:52.090443 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.090427 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:52:52.090798 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.090778 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:52:52.090842 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.090780 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:52:52.092088 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092076 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:52:52.092130 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092093 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 
17:52:52.092130 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092099 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:52:52.092130 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092104 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:52:52.092130 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092110 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:52:52.092130 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092116 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:52:52.092130 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092121 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:52:52.092130 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092126 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:52:52.092307 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092133 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:52:52.092307 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092140 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:52:52.092307 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092148 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:52:52.092307 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092157 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:52:52.092942 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092933 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:52:52.092942 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.092942 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:52:52.096340 ip-10-0-142-118 kubenswrapper[2568]: I0422 
17:52:52.096326 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:52:52.096387 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.096361 2568 server.go:1295] "Started kubelet" Apr 22 17:52:52.096462 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.096442 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:52:52.096549 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.096516 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:52:52.096579 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.096564 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:52:52.097537 ip-10-0-142-118 systemd[1]: Started Kubernetes Kubelet. Apr 22 17:52:52.097919 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.097688 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:52:52.098647 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.098636 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 22 17:52:52.105275 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.105219 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 17:52:52.105275 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.105264 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:52:52.105740 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.105709 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:52:52.106254 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106235 2568 factory.go:55] Registering systemd factory Apr 22 17:52:52.106254 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106257 2568 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:52:52.106382 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106296 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:52:52.106382 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106299 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:52:52.106382 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106319 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:52:52.106382 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106335 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-118.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:52:52.106594 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106445 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:52:52.106594 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106455 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:52:52.106594 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106457 2568 factory.go:153] Registering CRI-O factory Apr 22 17:52:52.106594 
ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106480 2568 factory.go:223] Registration of the crio container factory successfully Apr 22 17:52:52.106594 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.106498 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found" Apr 22 17:52:52.106594 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106541 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:52:52.106594 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106564 2568 factory.go:103] Registering Raw factory Apr 22 17:52:52.106594 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106575 2568 manager.go:1196] Started watching for new ooms in manager Apr 22 17:52:52.107012 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.106999 2568 manager.go:319] Starting recovery of all containers Apr 22 17:52:52.108661 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.106390 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-118.ec2.internal.18a8bf48bb2a88e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-118.ec2.internal,UID:ip-10-0-142-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-118.ec2.internal,},FirstTimestamp:2026-04-22 17:52:52.096338149 +0000 UTC m=+0.404969646,LastTimestamp:2026-04-22 17:52:52.096338149 +0000 UTC m=+0.404969646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-118.ec2.internal,}" Apr 22 17:52:52.113773 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.113747 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 17:52:52.116435 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.112542 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 17:52:52.118967 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.118953 2568 manager.go:324] Recovery completed Apr 22 17:52:52.123610 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.123595 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:52:52.125782 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.125757 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:52:52.125841 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.125793 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:52:52.125841 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.125803 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:52:52.126280 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.126264 2568 cpu_manager.go:222] "Starting CPU manager" 
policy="none" Apr 22 17:52:52.126280 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.126279 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:52:52.126365 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.126295 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:52:52.128326 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.128311 2568 policy_none.go:49] "None policy: Start" Apr 22 17:52:52.128412 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.128330 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:52:52.128412 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.128344 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:52:52.129404 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.127968 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-118.ec2.internal.18a8bf48bcebc9f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-118.ec2.internal,UID:ip-10-0-142-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-118.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-118.ec2.internal,},FirstTimestamp:2026-04-22 17:52:52.12578047 +0000 UTC m=+0.434411948,LastTimestamp:2026-04-22 17:52:52.12578047 +0000 UTC m=+0.434411948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-118.ec2.internal,}" Apr 22 17:52:52.130482 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.130365 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p68sf" Apr 22 17:52:52.141624 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.141556 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-118.ec2.internal.18a8bf48bcec0ea3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-118.ec2.internal,UID:ip-10-0-142-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-142-118.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-142-118.ec2.internal,},FirstTimestamp:2026-04-22 17:52:52.125798051 +0000 UTC m=+0.434429530,LastTimestamp:2026-04-22 17:52:52.125798051 +0000 UTC m=+0.434429530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-118.ec2.internal,}" Apr 22 17:52:52.142244 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.142225 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p68sf" Apr 22 17:52:52.170394 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.170377 2568 manager.go:341] "Starting Device Plugin manager" Apr 22 17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.170436 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.170452 2568 server.go:85] "Starting device plugin registration server" Apr 22 17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.170687 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 
17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.170699 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.170830 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.170933 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.170943 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.171345 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:52:52.173886 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.171380 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-118.ec2.internal\" not found" Apr 22 17:52:52.206511 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.206478 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 17:52:52.207652 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.207632 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 17:52:52.207778 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.207658 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:52:52.207778 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.207676 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 17:52:52.207778 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.207682 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:52:52.207778 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.207716 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:52:52.212460 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.212443 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:52:52.271116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.271090 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:52:52.272001 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.271985 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:52:52.272077 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.272017 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:52:52.272077 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.272032 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:52:52.272077 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.272068 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.282294 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.282277 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.282345 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.282298 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-118.ec2.internal\": node \"ip-10-0-142-118.ec2.internal\" not found" Apr 22 
17:52:52.298952 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.298932 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found" Apr 22 17:52:52.308010 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.307994 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal"] Apr 22 17:52:52.308058 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.308051 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:52:52.309462 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.309447 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:52:52.309536 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.309474 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:52:52.309536 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.309487 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:52:52.310593 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.310581 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:52:52.310740 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.310716 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.310807 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.310754 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:52:52.311218 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.311196 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:52:52.311279 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.311224 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:52:52.311279 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.311239 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:52:52.311279 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.311273 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:52:52.311378 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.311289 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:52:52.311378 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.311302 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:52:52.312337 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.312324 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.312379 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.312355 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:52:52.313189 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.313174 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:52:52.313261 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.313205 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:52:52.313261 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.313223 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:52:52.345235 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.345217 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-118.ec2.internal\" not found" node="ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.349477 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.349459 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-118.ec2.internal\" not found" node="ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.399847 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.399827 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found" Apr 22 17:52:52.406975 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.406956 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/93fc1b8dfb74ea02eb01ffea326fa5a0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal\" (UID: \"93fc1b8dfb74ea02eb01ffea326fa5a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.407038 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.406978 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1efe39c18a96fb22c7e6fa00ec347d37-config\") pod \"kube-apiserver-proxy-ip-10-0-142-118.ec2.internal\" (UID: \"1efe39c18a96fb22c7e6fa00ec347d37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.407038 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.407000 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/93fc1b8dfb74ea02eb01ffea326fa5a0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal\" (UID: \"93fc1b8dfb74ea02eb01ffea326fa5a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.500293 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.500224 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found" Apr 22 17:52:52.507500 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.507481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93fc1b8dfb74ea02eb01ffea326fa5a0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal\" (UID: \"93fc1b8dfb74ea02eb01ffea326fa5a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal" Apr 22 17:52:52.507570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.507507 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1efe39c18a96fb22c7e6fa00ec347d37-config\") pod \"kube-apiserver-proxy-ip-10-0-142-118.ec2.internal\" (UID: \"1efe39c18a96fb22c7e6fa00ec347d37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:52.507570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.507526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/93fc1b8dfb74ea02eb01ffea326fa5a0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal\" (UID: \"93fc1b8dfb74ea02eb01ffea326fa5a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:52.507570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.507563 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/93fc1b8dfb74ea02eb01ffea326fa5a0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal\" (UID: \"93fc1b8dfb74ea02eb01ffea326fa5a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:52.507690 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.507582 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1efe39c18a96fb22c7e6fa00ec347d37-config\") pod \"kube-apiserver-proxy-ip-10-0-142-118.ec2.internal\" (UID: \"1efe39c18a96fb22c7e6fa00ec347d37\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:52.507690 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.507588 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93fc1b8dfb74ea02eb01ffea326fa5a0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal\" (UID: \"93fc1b8dfb74ea02eb01ffea326fa5a0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:52.600986 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.600946 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found"
Apr 22 17:52:52.647436 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.647404 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:52.651870 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:52.651857 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:52.701601 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.701575 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found"
Apr 22 17:52:52.802117 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.802060 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found"
Apr 22 17:52:52.902670 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:52.902642 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found"
Apr 22 17:52:53.003234 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:53.003200 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found"
Apr 22 17:52:53.005356 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.005339 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 17:52:53.005485 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.005470 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 17:52:53.094223 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.094147 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:52:53.103418 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:53.103392 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-118.ec2.internal\" not found"
Apr 22 17:52:53.105553 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.105540 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 17:52:53.117980 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.117955 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:52:53.144950 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.144922 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:47:52 +0000 UTC" deadline="2027-12-30 08:36:18.429836828 +0000 UTC"
Apr 22 17:52:53.144950 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.144947 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14798h43m25.284892073s"
Apr 22 17:52:53.159044 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.159019 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6bfdd"
Apr 22 17:52:53.167112 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.167088 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6bfdd"
Apr 22 17:52:53.175070 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.175053 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:52:53.206264 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.206245 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:53.216355 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.216337 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:52:53.217415 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.217395 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal"
Apr 22 17:52:53.229023 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.229006 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 17:52:53.276124 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:53.276091 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1efe39c18a96fb22c7e6fa00ec347d37.slice/crio-285b0317111295d8a4e171156f85f6984490d9f19fc9a7e61f2ae805c3153b31 WatchSource:0}: Error finding container 285b0317111295d8a4e171156f85f6984490d9f19fc9a7e61f2ae805c3153b31: Status 404 returned error can't find the container with id 285b0317111295d8a4e171156f85f6984490d9f19fc9a7e61f2ae805c3153b31
Apr 22 17:52:53.279741 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.279716 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 17:52:53.301380 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:53.301353 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93fc1b8dfb74ea02eb01ffea326fa5a0.slice/crio-64f9ce7deac95f58d14c8034417ddd374395a544c24f6f83b71ab81ff127ff0d WatchSource:0}: Error finding container 64f9ce7deac95f58d14c8034417ddd374395a544c24f6f83b71ab81ff127ff0d: Status 404 returned error can't find the container with id 64f9ce7deac95f58d14c8034417ddd374395a544c24f6f83b71ab81ff127ff0d
Apr 22 17:52:53.489655 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:53.489579 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:52:54.068405 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.068375 2568 apiserver.go:52] "Watching apiserver"
Apr 22 17:52:54.078749 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.078683 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 17:52:54.079697 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.079673 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fvkcv","kube-system/konnectivity-agent-ql9lr","kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal","openshift-image-registry/node-ca-tjtfp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal","openshift-multus/multus-4ljm6","openshift-multus/network-metrics-daemon-k7kpf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x","openshift-cluster-node-tuning-operator/tuned-ccl29","openshift-dns/node-resolver-sbk9w","openshift-multus/multus-additional-cni-plugins-s7s7v","openshift-network-diagnostics/network-check-target-4phwt","openshift-network-operator/iptables-alerter-h4knh"]
Apr 22 17:52:54.085276 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.085248 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:52:54.085405 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.085378 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:52:54.087290 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.087263 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ql9lr"
Apr 22 17:52:54.089812 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.089792 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fhsjw\""
Apr 22 17:52:54.089904 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.089821 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 17:52:54.089959 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.089908 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.090105 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.090090 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 17:52:54.092527 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.092161 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-589vt\""
Apr 22 17:52:54.092527 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.092248 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tjtfp"
Apr 22 17:52:54.092527 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.092322 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 17:52:54.092721 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.092647 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 17:52:54.092721 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.092693 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 17:52:54.092721 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.092651 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 17:52:54.095050 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.094473 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ghrmk\""
Apr 22 17:52:54.095050 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.094477 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 17:52:54.095050 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.094556 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 17:52:54.095050 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.094695 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.095050 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.094804 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x"
Apr 22 17:52:54.095050 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.094861 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 17:52:54.096931 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.096911 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 17:52:54.097036 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.096943 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ccl29"
Apr 22 17:52:54.098595 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.098574 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 17:52:54.098595 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.098590 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fb6nv\""
Apr 22 17:52:54.098762 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.098610 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 17:52:54.098762 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.098628 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 17:52:54.099047 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.098989 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 17:52:54.099230 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.099213 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-99lft\""
Apr 22 17:52:54.099305 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.099223 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 17:52:54.099305 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.099245 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 17:52:54.099305 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.099261 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 17:52:54.099305 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.099282 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 17:52:54.100001 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.099983 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:52:54.100089 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.100003 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:52:54.100263 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.100244 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sbk9w"
Apr 22 17:52:54.100849 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.100819 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-94bz7\""
Apr 22 17:52:54.102366 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.102297 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 17:52:54.102662 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.102519 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sg8sc\""
Apr 22 17:52:54.102662 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.102542 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 17:52:54.102823 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.102801 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s7s7v"
Apr 22 17:52:54.105272 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.105253 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 17:52:54.105363 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.105308 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-556xx\""
Apr 22 17:52:54.105417 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.105253 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 17:52:54.108585 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.107792 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h4knh"
Apr 22 17:52:54.108585 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.108188 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:52:54.108585 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.108266 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:52:54.112288 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.110385 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:52:54.112288 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.110674 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hqmcr\""
Apr 22 17:52:54.112288 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.111106 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 17:52:54.112288 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.111418 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 17:52:54.116310 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v"
Apr 22 17:52:54.116406 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116322 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v"
Apr 22 17:52:54.116406 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116349 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-registration-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x"
Apr 22 17:52:54.116406 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116375 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-etc-selinux\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x"
Apr 22 17:52:54.116406 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116401 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:52:54.116617 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116425 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-multus-certs\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.116617 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-run-ovn-kubernetes\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.116617 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116506 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-cni-netd\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.116617 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqm7n\" (UniqueName: \"kubernetes.io/projected/b920c1ec-1c95-459e-a9cf-a36565ac5b48-kube-api-access-tqm7n\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.116617 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-modprobe-d\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29"
Apr 22 17:52:54.116617 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116576 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-kubelet\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.116617 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116598 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-systemd\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116628 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-cni-binary-copy\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116652 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-k8s-cni-cncf-io\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116677 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-hostroot\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116699 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-daemon-config\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116722 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovnkube-config\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116814 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-var-lib-kubelet\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-sys-fs\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116902 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1bfe7678-f24d-4f1f-81a3-b65e7179ae30-konnectivity-ca\") pod \"konnectivity-agent-ql9lr\" (UID: \"1bfe7678-f24d-4f1f-81a3-b65e7179ae30\") " pod="kube-system/konnectivity-agent-ql9lr"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116926 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-system-cni-dir\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.116963 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.116967 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-cni-dir\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117039 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-os-release\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117068 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-os-release\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-conf-dir\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117318 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-slash\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117340 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-run-netns\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117363 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-run\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117386 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h46lh\" (UniqueName: \"kubernetes.io/projected/8718afb8-70e3-49cc-879c-d5cc4a081622-kube-api-access-h46lh\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-etc-kubernetes\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.117448 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117436 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-ovn\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117460 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-hosts-file\") pod \"node-resolver-sbk9w\" (UID: \"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117520 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r27j8\" (UniqueName: \"kubernetes.io/projected/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-kube-api-access-r27j8\") pod \"node-resolver-sbk9w\" (UID: \"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117544 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-netns\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117568 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-926mc\" (UniqueName: \"kubernetes.io/projected/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-kube-api-access-926mc\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-624lx\" (UniqueName: \"kubernetes.io/projected/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-kube-api-access-624lx\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-systemd-units\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117660 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-var-lib-openvswitch\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117685 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-log-socket\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-kubernetes\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117750 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysctl-conf\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117773 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttcjs\" (UniqueName: \"kubernetes.io/projected/8c0ae7fd-c205-4928-b51f-9f80202d3f77-kube-api-access-ttcjs\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " pod="openshift-image-registry/node-ca-tjtfp"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-device-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x"
Apr 22 17:52:54.117878 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-socket-dir-parent\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117896 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-cni-bin\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117924 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-cni-multus\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6"
Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovnkube-script-lib\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.117985 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-systemd\") pod \"tuned-ccl29\" (UID: 
\"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118062 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-lib-modules\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118107 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysconfig\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118156 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysctl-d\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118223 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-tuned\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cnibin\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118299 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-node-log\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c0ae7fd-c205-4928-b51f-9f80202d3f77-host\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118352 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-cnibin\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118376 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-cni-bin\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-sys\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-system-cni-dir\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118453 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:52:54.118582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118475 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c0ae7fd-c205-4928-b51f-9f80202d3f77-serviceca\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-socket-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.119413 ip-10-0-142-118 
kubenswrapper[2568]: I0422 17:52:54.118525 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118551 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovn-node-metrics-cert\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118577 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cni-binary-copy\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-host-slash\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/1bfe7678-f24d-4f1f-81a3-b65e7179ae30-agent-certs\") pod \"konnectivity-agent-ql9lr\" (UID: \"1bfe7678-f24d-4f1f-81a3-b65e7179ae30\") " pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118643 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-kubelet\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-etc-openvswitch\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118689 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-tmp-dir\") pod \"node-resolver-sbk9w\" (UID: \"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118724 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118764 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndfb\" (UniqueName: \"kubernetes.io/projected/e29ab8a7-8881-4951-93eb-55d0b996dbcb-kube-api-access-xndfb\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118788 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-iptables-alerter-script\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx2sd\" (UniqueName: \"kubernetes.io/projected/ab99124f-2959-4b17-ab76-24041f074fe5-kube-api-access-cx2sd\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118844 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwkjp\" (UniqueName: \"kubernetes.io/projected/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-kube-api-access-xwkjp\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.119413 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118891 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-openvswitch\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.120022 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118914 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-env-overrides\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.120022 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118936 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-host\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.120022 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.118958 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8718afb8-70e3-49cc-879c-d5cc4a081622-tmp\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.168608 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.168575 2568 certificate_manager.go:715] 
"Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:47:53 +0000 UTC" deadline="2028-01-17 06:40:30.824249692 +0000 UTC" Apr 22 17:52:54.168608 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.168606 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15228h47m36.655647162s" Apr 22 17:52:54.207570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.207542 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 17:52:54.212609 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.212558 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal" event={"ID":"93fc1b8dfb74ea02eb01ffea326fa5a0","Type":"ContainerStarted","Data":"64f9ce7deac95f58d14c8034417ddd374395a544c24f6f83b71ab81ff127ff0d"} Apr 22 17:52:54.213702 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.213670 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal" event={"ID":"1efe39c18a96fb22c7e6fa00ec347d37","Type":"ContainerStarted","Data":"285b0317111295d8a4e171156f85f6984490d9f19fc9a7e61f2ae805c3153b31"} Apr 22 17:52:54.219143 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-cni-multus\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.219253 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219145 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovnkube-script-lib\") pod 
\"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.219253 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-systemd\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219253 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219194 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-lib-modules\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219253 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219215 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysconfig\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219253 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219238 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysctl-d\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-tuned\") pod 
\"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219267 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-cni-multus\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219285 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-systemd\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cnibin\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219325 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cnibin\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-node-log\") pod 
\"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c0ae7fd-c205-4928-b51f-9f80202d3f77-host\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219381 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-cnibin\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219405 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-cni-bin\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219429 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-lib-modules\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-sys\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219467 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-sys\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.219487 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-system-cni-dir\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219509 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-node-log\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219511 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c0ae7fd-c205-4928-b51f-9f80202d3f77-serviceca\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " 
pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-socket-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219590 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovn-node-metrics-cert\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysconfig\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cni-binary-copy\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219686 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-host-slash\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219688 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-system-cni-dir\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219763 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c0ae7fd-c205-4928-b51f-9f80202d3f77-host\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219811 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysctl-d\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovnkube-script-lib\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219854 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-cni-bin\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219893 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-socket-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.220090 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219896 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1bfe7678-f24d-4f1f-81a3-b65e7179ae30-agent-certs\") pod \"konnectivity-agent-ql9lr\" (UID: \"1bfe7678-f24d-4f1f-81a3-b65e7179ae30\") " pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219936 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-kubelet\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.219604 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219965 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-etc-openvswitch\") pod \"ovnkube-node-fvkcv\" 
(UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219988 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-tmp-dir\") pod \"node-resolver-sbk9w\" (UID: \"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.220020 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs podName:ab99124f-2959-4b17-ab76-24041f074fe5 nodeName:}" failed. No retries permitted until 2026-04-22 17:52:54.72000257 +0000 UTC m=+3.028634047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs") pod "network-metrics-daemon-k7kpf" (UID: "ab99124f-2959-4b17-ab76-24041f074fe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-kubelet\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-etc-openvswitch\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.220948 ip-10-0-142-118 
kubenswrapper[2568]: I0422 17:52:54.220268 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c0ae7fd-c205-4928-b51f-9f80202d3f77-serviceca\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.219811 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-cnibin\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220338 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-host-slash\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-tmp-dir\") pod \"node-resolver-sbk9w\" (UID: \"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220506 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220542 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xndfb\" (UniqueName: \"kubernetes.io/projected/e29ab8a7-8881-4951-93eb-55d0b996dbcb-kube-api-access-xndfb\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-iptables-alerter-script\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx2sd\" (UniqueName: \"kubernetes.io/projected/ab99124f-2959-4b17-ab76-24041f074fe5-kube-api-access-cx2sd\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:52:54.220948 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220631 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwkjp\" (UniqueName: \"kubernetes.io/projected/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-kube-api-access-xwkjp\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220656 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") 
" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220679 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-openvswitch\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-env-overrides\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220773 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220778 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-host\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-host\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8718afb8-70e3-49cc-879c-d5cc4a081622-tmp\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220889 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cni-binary-copy\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220941 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220971 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-registration-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.220999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-etc-selinux\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.221025 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.221051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-multus-certs\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.221112 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-multus-certs\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.221171 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.221718 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.221213 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-openvswitch\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.222517 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.221348 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.222517 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.222150 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-env-overrides\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.223390 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-iptables-alerter-script\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.223518 ip-10-0-142-118 
kubenswrapper[2568]: I0422 17:52:54.223497 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-run-ovn-kubernetes\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.223574 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-cni-netd\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.223623 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223587 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqm7n\" (UniqueName: \"kubernetes.io/projected/b920c1ec-1c95-459e-a9cf-a36565ac5b48-kube-api-access-tqm7n\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.223623 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223618 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-modprobe-d\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.223723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223650 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-kubelet\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.223723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-systemd\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.223723 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223713 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-cni-binary-copy\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.223894 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223761 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-k8s-cni-cncf-io\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.223894 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-hostroot\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.223894 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-daemon-config\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " 
pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.223894 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223846 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovnkube-config\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.223894 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223878 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-var-lib-kubelet\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.224116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223907 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-sys-fs\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.224116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1bfe7678-f24d-4f1f-81a3-b65e7179ae30-konnectivity-ca\") pod \"konnectivity-agent-ql9lr\" (UID: \"1bfe7678-f24d-4f1f-81a3-b65e7179ae30\") " pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:52:54.224116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223938 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e29ab8a7-8881-4951-93eb-55d0b996dbcb-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.224116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.223966 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-system-cni-dir\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.224116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224018 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-cni-dir\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.224116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-os-release\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.224116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224071 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8718afb8-70e3-49cc-879c-d5cc4a081622-tmp\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.224116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224086 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-os-release\") pod \"multus-4ljm6\" 
(UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-conf-dir\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224154 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-slash\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224186 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-run-netns\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224216 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-run\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224258 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h46lh\" (UniqueName: \"kubernetes.io/projected/8718afb8-70e3-49cc-879c-d5cc4a081622-kube-api-access-h46lh\") pod \"tuned-ccl29\" (UID: 
\"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-etc-kubernetes\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-modprobe-d\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224338 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-ovn\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.224466 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224414 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovn-node-metrics-cert\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.225071 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.224022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-cni-netd\") pod \"ovnkube-node-fvkcv\" (UID: 
\"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.225173 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225154 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-run-ovn-kubernetes\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225293 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-hosts-file\") pod \"node-resolver-sbk9w\" (UID: \"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225340 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r27j8\" (UniqueName: \"kubernetes.io/projected/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-kube-api-access-r27j8\") pod \"node-resolver-sbk9w\" (UID: \"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225349 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-kubelet\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225376 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-netns\") pod \"multus-4ljm6\" (UID: 
\"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-systemd\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225411 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-926mc\" (UniqueName: \"kubernetes.io/projected/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-kube-api-access-926mc\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225447 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-624lx\" (UniqueName: \"kubernetes.io/projected/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-kube-api-access-624lx\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225496 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-system-cni-dir\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-cni-dir\") pod \"multus-4ljm6\" (UID: 
\"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225664 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e29ab8a7-8881-4951-93eb-55d0b996dbcb-os-release\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225677 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-systemd-units\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225715 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-var-lib-openvswitch\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225723 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-os-release\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225753 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-hosts-file\") pod \"node-resolver-sbk9w\" (UID: 
\"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225787 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-var-lib-openvswitch\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225824 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-log-socket\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225859 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-kubernetes\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.226774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225871 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-tuned\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225894 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysctl-conf\") pod \"tuned-ccl29\" (UID: 
\"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-run-ovn\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225932 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttcjs\" (UniqueName: \"kubernetes.io/projected/8c0ae7fd-c205-4928-b51f-9f80202d3f77-kube-api-access-ttcjs\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-netns\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.225967 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-device-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226031 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-etc-selinux\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: 
\"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226036 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-cni-binary-copy\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226031 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-device-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-systemd-units\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226111 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-kubernetes\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226151 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-log-socket\") pod \"ovnkube-node-fvkcv\" (UID: 
\"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226190 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-etc-sysctl-conf\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226194 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-conf-dir\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226219 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-socket-dir-parent\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-cni-bin\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226336 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-slash\") pod \"ovnkube-node-fvkcv\" (UID: 
\"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226377 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-run\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.228144 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8718afb8-70e3-49cc-879c-d5cc4a081622-var-lib-kubelet\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b920c1ec-1c95-459e-a9cf-a36565ac5b48-host-run-netns\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226713 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-etc-kubernetes\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226745 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-daemon-config\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " 
pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-sys-fs\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.226852 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b920c1ec-1c95-459e-a9cf-a36565ac5b48-ovnkube-config\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.227176 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-run-k8s-cni-cncf-io\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.227233 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-hostroot\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.227271 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1bfe7678-f24d-4f1f-81a3-b65e7179ae30-konnectivity-ca\") pod \"konnectivity-agent-ql9lr\" (UID: \"1bfe7678-f24d-4f1f-81a3-b65e7179ae30\") " pod="kube-system/konnectivity-agent-ql9lr" Apr 22 
17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.227319 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-host-var-lib-cni-bin\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.227384 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-multus-socket-dir-parent\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.228848 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.227491 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-registration-dir\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.229374 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.229115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1bfe7678-f24d-4f1f-81a3-b65e7179ae30-agent-certs\") pod \"konnectivity-agent-ql9lr\" (UID: \"1bfe7678-f24d-4f1f-81a3-b65e7179ae30\") " pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:52:54.229939 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.229917 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx2sd\" (UniqueName: \"kubernetes.io/projected/ab99124f-2959-4b17-ab76-24041f074fe5-kube-api-access-cx2sd\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " 
pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:52:54.231492 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.231468 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndfb\" (UniqueName: \"kubernetes.io/projected/e29ab8a7-8881-4951-93eb-55d0b996dbcb-kube-api-access-xndfb\") pod \"multus-additional-cni-plugins-s7s7v\" (UID: \"e29ab8a7-8881-4951-93eb-55d0b996dbcb\") " pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.232107 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.232056 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqm7n\" (UniqueName: \"kubernetes.io/projected/b920c1ec-1c95-459e-a9cf-a36565ac5b48-kube-api-access-tqm7n\") pod \"ovnkube-node-fvkcv\" (UID: \"b920c1ec-1c95-459e-a9cf-a36565ac5b48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.232457 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.232361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwkjp\" (UniqueName: \"kubernetes.io/projected/d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd-kube-api-access-xwkjp\") pod \"iptables-alerter-h4knh\" (UID: \"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd\") " pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.236361 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.236202 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:52:54.236361 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.236225 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:52:54.236361 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.236239 2568 projected.go:194] Error preparing data for projected volume kube-api-access-bjpp8 
for pod openshift-network-diagnostics/network-check-target-4phwt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:52:54.236361 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.236304 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8 podName:d950d834-86a0-437a-b1c6-30e88678d30b nodeName:}" failed. No retries permitted until 2026-04-22 17:52:54.736287162 +0000 UTC m=+3.044918629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bjpp8" (UniqueName: "kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8") pod "network-check-target-4phwt" (UID: "d950d834-86a0-437a-b1c6-30e88678d30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:52:54.237395 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.237369 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-624lx\" (UniqueName: \"kubernetes.io/projected/0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3-kube-api-access-624lx\") pod \"multus-4ljm6\" (UID: \"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3\") " pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.237485 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.237411 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttcjs\" (UniqueName: \"kubernetes.io/projected/8c0ae7fd-c205-4928-b51f-9f80202d3f77-kube-api-access-ttcjs\") pod \"node-ca-tjtfp\" (UID: \"8c0ae7fd-c205-4928-b51f-9f80202d3f77\") " pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.237679 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.237656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-926mc\" (UniqueName: \"kubernetes.io/projected/806c5b6e-2a45-4930-8e2d-7acdcc9590f9-kube-api-access-926mc\") pod \"aws-ebs-csi-driver-node-zm29x\" (UID: \"806c5b6e-2a45-4930-8e2d-7acdcc9590f9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.238382 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.238359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r27j8\" (UniqueName: \"kubernetes.io/projected/3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca-kube-api-access-r27j8\") pod \"node-resolver-sbk9w\" (UID: \"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca\") " pod="openshift-dns/node-resolver-sbk9w" Apr 22 17:52:54.238930 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.238913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h46lh\" (UniqueName: \"kubernetes.io/projected/8718afb8-70e3-49cc-879c-d5cc4a081622-kube-api-access-h46lh\") pod \"tuned-ccl29\" (UID: \"8718afb8-70e3-49cc-879c-d5cc4a081622\") " pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.400037 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.399967 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:52:54.410891 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.410862 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4ljm6" Apr 22 17:52:54.421547 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.421523 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tjtfp" Apr 22 17:52:54.427455 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.427434 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" Apr 22 17:52:54.435023 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.435003 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:52:54.443577 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.443559 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ccl29" Apr 22 17:52:54.451062 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.451046 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sbk9w" Apr 22 17:52:54.459955 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.459937 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" Apr 22 17:52:54.466534 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.466513 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-h4knh" Apr 22 17:52:54.473538 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.473519 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:52:54.728711 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.728629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:52:54.728868 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.728744 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:52:54.728868 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.728855 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs podName:ab99124f-2959-4b17-ab76-24041f074fe5 nodeName:}" failed. No retries permitted until 2026-04-22 17:52:55.728835817 +0000 UTC m=+4.037467288 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs") pod "network-metrics-daemon-k7kpf" (UID: "ab99124f-2959-4b17-ab76-24041f074fe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:52:54.829477 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:54.829438 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:52:54.829646 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.829601 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:52:54.829646 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.829622 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:52:54.829646 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.829634 2568 projected.go:194] Error preparing data for projected volume kube-api-access-bjpp8 for pod openshift-network-diagnostics/network-check-target-4phwt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:52:54.829796 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:54.829686 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8 podName:d950d834-86a0-437a-b1c6-30e88678d30b nodeName:}" failed. 
No retries permitted until 2026-04-22 17:52:55.82967031 +0000 UTC m=+4.138301775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-bjpp8" (UniqueName: "kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8") pod "network-check-target-4phwt" (UID: "d950d834-86a0-437a-b1c6-30e88678d30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:52:54.943024 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:54.942992 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd9542b_a42c_4dbd_a379_4f7eea0a1ca3.slice/crio-3df8bf421a1a2bad5c78aa2b097a756ef0809913d5149dec9a5d9e590eca218c WatchSource:0}: Error finding container 3df8bf421a1a2bad5c78aa2b097a756ef0809913d5149dec9a5d9e590eca218c: Status 404 returned error can't find the container with id 3df8bf421a1a2bad5c78aa2b097a756ef0809913d5149dec9a5d9e590eca218c Apr 22 17:52:54.944902 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:54.944872 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb920c1ec_1c95_459e_a9cf_a36565ac5b48.slice/crio-c0ef9da3917454c7ee0e55ef6c6c9820079c5738054b10724e1682a36f277fec WatchSource:0}: Error finding container c0ef9da3917454c7ee0e55ef6c6c9820079c5738054b10724e1682a36f277fec: Status 404 returned error can't find the container with id c0ef9da3917454c7ee0e55ef6c6c9820079c5738054b10724e1682a36f277fec Apr 22 17:52:54.947518 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:54.947487 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode29ab8a7_8881_4951_93eb_55d0b996dbcb.slice/crio-1a0260aeffab884f3ee05a2121b090eceaef134c5c03dbc5386b92f2780c3874 WatchSource:0}: Error finding container 
1a0260aeffab884f3ee05a2121b090eceaef134c5c03dbc5386b92f2780c3874: Status 404 returned error can't find the container with id 1a0260aeffab884f3ee05a2121b090eceaef134c5c03dbc5386b92f2780c3874 Apr 22 17:52:54.950168 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:52:54.950037 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1e6d5b7_a3d0_4a7a_965b_c59191a9dbfd.slice/crio-0e7da10494d38fcb33950f2149acf8d94b361acad216901e6e2d3a700a41e588 WatchSource:0}: Error finding container 0e7da10494d38fcb33950f2149acf8d94b361acad216901e6e2d3a700a41e588: Status 404 returned error can't find the container with id 0e7da10494d38fcb33950f2149acf8d94b361acad216901e6e2d3a700a41e588 Apr 22 17:52:55.169606 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.169419 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:47:53 +0000 UTC" deadline="2027-09-18 16:57:21.698760075 +0000 UTC" Apr 22 17:52:55.169606 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.169601 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12335h4m26.529161966s" Apr 22 17:52:55.208842 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.208816 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:52:55.208998 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:55.208936 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5" Apr 22 17:52:55.215903 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.215875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ql9lr" event={"ID":"1bfe7678-f24d-4f1f-81a3-b65e7179ae30","Type":"ContainerStarted","Data":"100c85ebe5a606233c2e2bb8e9231185005c2b7ba456ef6174fbe78660ae87e8"} Apr 22 17:52:55.216847 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.216823 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4ljm6" event={"ID":"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3","Type":"ContainerStarted","Data":"3df8bf421a1a2bad5c78aa2b097a756ef0809913d5149dec9a5d9e590eca218c"} Apr 22 17:52:55.218394 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.218374 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal" event={"ID":"1efe39c18a96fb22c7e6fa00ec347d37","Type":"ContainerStarted","Data":"44432be7d653ca1bb2203274623255aa120a9ade6e326fa8ca08a89ecf7a3648"} Apr 22 17:52:55.219442 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.219422 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sbk9w" event={"ID":"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca","Type":"ContainerStarted","Data":"bdc10d614f981be5a76d5d0558bd4e1e02b543609a764342875e8e2d1a4348f3"} Apr 22 17:52:55.220431 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.220400 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tjtfp" event={"ID":"8c0ae7fd-c205-4928-b51f-9f80202d3f77","Type":"ContainerStarted","Data":"062e3322e49f33449a7af799b0809eec433711a2700b2979f1f07bfe6d61a6a7"} Apr 22 17:52:55.221282 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.221258 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h4knh" 
event={"ID":"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd","Type":"ContainerStarted","Data":"0e7da10494d38fcb33950f2149acf8d94b361acad216901e6e2d3a700a41e588"}
Apr 22 17:52:55.222314 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.222296 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ccl29" event={"ID":"8718afb8-70e3-49cc-879c-d5cc4a081622","Type":"ContainerStarted","Data":"3a1410450be0118f6b9cb25ef6c2573ef66cfc0b04ec9d1aa38bae9be859db27"}
Apr 22 17:52:55.225869 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.225840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerStarted","Data":"1a0260aeffab884f3ee05a2121b090eceaef134c5c03dbc5386b92f2780c3874"}
Apr 22 17:52:55.226847 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.226822 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"c0ef9da3917454c7ee0e55ef6c6c9820079c5738054b10724e1682a36f277fec"}
Apr 22 17:52:55.227716 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.227686 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" event={"ID":"806c5b6e-2a45-4930-8e2d-7acdcc9590f9","Type":"ContainerStarted","Data":"51b8392dfc5d73df60520346ab79667d4d3be229dd97059aa47a92acb2680687"}
Apr 22 17:52:55.233048 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.232950 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-118.ec2.internal" podStartSLOduration=2.232936665 podStartE2EDuration="2.232936665s" podCreationTimestamp="2026-04-22 17:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:52:55.232871731 +0000 UTC m=+3.541503218" watchObservedRunningTime="2026-04-22 17:52:55.232936665 +0000 UTC m=+3.541568153"
Apr 22 17:52:55.735641 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.735596 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:52:55.735812 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:55.735784 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:52:55.735893 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:55.735849 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs podName:ab99124f-2959-4b17-ab76-24041f074fe5 nodeName:}" failed. No retries permitted until 2026-04-22 17:52:57.735829253 +0000 UTC m=+6.044460734 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs") pod "network-metrics-daemon-k7kpf" (UID: "ab99124f-2959-4b17-ab76-24041f074fe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:52:55.836325 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:55.836268 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:52:55.836462 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:55.836413 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:52:55.836462 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:55.836433 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:52:55.836462 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:55.836443 2568 projected.go:194] Error preparing data for projected volume kube-api-access-bjpp8 for pod openshift-network-diagnostics/network-check-target-4phwt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:52:55.836622 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:55.836490 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8 podName:d950d834-86a0-437a-b1c6-30e88678d30b nodeName:}" failed. No retries permitted until 2026-04-22 17:52:57.836476054 +0000 UTC m=+6.145107519 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-bjpp8" (UniqueName: "kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8") pod "network-check-target-4phwt" (UID: "d950d834-86a0-437a-b1c6-30e88678d30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:52:56.209276 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:56.208957 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:52:56.209276 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:56.209102 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:52:56.245000 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:56.244366 2568 generic.go:358] "Generic (PLEG): container finished" podID="93fc1b8dfb74ea02eb01ffea326fa5a0" containerID="0777823064ed45465c8b6d2f2fc2f021e8bf7b87da143c2aa58d979e2cc2ec8a" exitCode=0
Apr 22 17:52:56.245000 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:56.244788 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal" event={"ID":"93fc1b8dfb74ea02eb01ffea326fa5a0","Type":"ContainerDied","Data":"0777823064ed45465c8b6d2f2fc2f021e8bf7b87da143c2aa58d979e2cc2ec8a"}
Apr 22 17:52:57.208709 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:57.208185 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:52:57.208709 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:57.208346 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:52:57.258135 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:57.258097 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal" event={"ID":"93fc1b8dfb74ea02eb01ffea326fa5a0","Type":"ContainerStarted","Data":"bd0ec98efcd34dd1fb5e865e86a7115019cc037f52d3849d1bf5f89c93603caf"}
Apr 22 17:52:57.272277 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:57.272228 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-118.ec2.internal" podStartSLOduration=4.272210268 podStartE2EDuration="4.272210268s" podCreationTimestamp="2026-04-22 17:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:52:57.271332429 +0000 UTC m=+5.579963917" watchObservedRunningTime="2026-04-22 17:52:57.272210268 +0000 UTC m=+5.580841761"
Apr 22 17:52:57.752906 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:57.752284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:52:57.752906 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:57.752475 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:52:57.752906 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:57.752540 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs podName:ab99124f-2959-4b17-ab76-24041f074fe5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:01.752521278 +0000 UTC m=+10.061152744 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs") pod "network-metrics-daemon-k7kpf" (UID: "ab99124f-2959-4b17-ab76-24041f074fe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:52:57.853445 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:57.852791 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:52:57.853445 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:57.852965 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:52:57.853445 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:57.852983 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:52:57.853445 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:57.853005 2568 projected.go:194] Error preparing data for projected volume kube-api-access-bjpp8 for pod openshift-network-diagnostics/network-check-target-4phwt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:52:57.853445 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:57.853064 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8 podName:d950d834-86a0-437a-b1c6-30e88678d30b nodeName:}" failed. No retries permitted until 2026-04-22 17:53:01.853045552 +0000 UTC m=+10.161677036 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-bjpp8" (UniqueName: "kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8") pod "network-check-target-4phwt" (UID: "d950d834-86a0-437a-b1c6-30e88678d30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:52:58.209381 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:58.208911 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:52:58.209381 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:58.209040 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:52:59.208745 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:52:59.208701 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:52:59.209196 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:52:59.208856 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:00.209050 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.208691 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:00.209050 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:00.208841 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:53:00.771168 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.771135 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ldvlp"]
Apr 22 17:53:00.778011 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.777985 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.778168 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:00.778053 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444"
Apr 22 17:53:00.883513 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.883471 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0289f618-f4aa-4688-a261-c755d1a71444-dbus\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.883710 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.883568 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0289f618-f4aa-4688-a261-c755d1a71444-kubelet-config\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.883710 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.883597 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.984561 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.984508 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0289f618-f4aa-4688-a261-c755d1a71444-dbus\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.984720 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.984605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0289f618-f4aa-4688-a261-c755d1a71444-kubelet-config\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.984720 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.984637 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.984720 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.984695 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0289f618-f4aa-4688-a261-c755d1a71444-dbus\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.984914 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:00.984788 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:00.984914 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:00.984793 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0289f618-f4aa-4688-a261-c755d1a71444-kubelet-config\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:00.984914 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:00.984843 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret podName:0289f618-f4aa-4688-a261-c755d1a71444 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:01.484829288 +0000 UTC m=+9.793460753 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret") pod "global-pull-secret-syncer-ldvlp" (UID: "0289f618-f4aa-4688-a261-c755d1a71444") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:01.208579 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:01.208495 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:01.208785 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.208651 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:01.489404 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:01.489336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:01.489785 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.489493 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:01.489785 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.489557 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret podName:0289f618-f4aa-4688-a261-c755d1a71444 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:02.489538815 +0000 UTC m=+10.798170282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret") pod "global-pull-secret-syncer-ldvlp" (UID: "0289f618-f4aa-4688-a261-c755d1a71444") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:01.791790 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:01.791701 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:01.791960 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.791890 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:01.791960 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.791957 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs podName:ab99124f-2959-4b17-ab76-24041f074fe5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:09.791935985 +0000 UTC m=+18.100567453 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs") pod "network-metrics-daemon-k7kpf" (UID: "ab99124f-2959-4b17-ab76-24041f074fe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:01.892544 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:01.892503 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:01.892756 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.892703 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:01.892756 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.892724 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:01.892756 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.892754 2568 projected.go:194] Error preparing data for projected volume kube-api-access-bjpp8 for pod openshift-network-diagnostics/network-check-target-4phwt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:01.892926 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:01.892819 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8 podName:d950d834-86a0-437a-b1c6-30e88678d30b nodeName:}" failed. No retries permitted until 2026-04-22 17:53:09.892801071 +0000 UTC m=+18.201432543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-bjpp8" (UniqueName: "kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8") pod "network-check-target-4phwt" (UID: "d950d834-86a0-437a-b1c6-30e88678d30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:02.208841 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:02.208766 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:02.209006 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:02.208880 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:53:02.209421 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:02.209286 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:02.209421 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:02.209385 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444"
Apr 22 17:53:02.498073 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:02.497448 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:02.498073 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:02.497622 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:02.498073 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:02.497680 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret podName:0289f618-f4aa-4688-a261-c755d1a71444 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:04.497662063 +0000 UTC m=+12.806293534 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret") pod "global-pull-secret-syncer-ldvlp" (UID: "0289f618-f4aa-4688-a261-c755d1a71444") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:03.208617 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:03.208580 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:03.208815 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:03.208741 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:04.208665 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:04.208592 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:04.209094 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:04.208593 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:04.209094 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:04.208702 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:53:04.209094 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:04.208833 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444"
Apr 22 17:53:04.513369 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:04.513280 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:04.513530 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:04.513405 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:04.513530 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:04.513458 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret podName:0289f618-f4aa-4688-a261-c755d1a71444 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:08.513445234 +0000 UTC m=+16.822076698 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret") pod "global-pull-secret-syncer-ldvlp" (UID: "0289f618-f4aa-4688-a261-c755d1a71444") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:05.208157 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:05.208127 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:05.208320 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:05.208253 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:06.208434 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:06.208399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:06.208925 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:06.208399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:06.208925 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:06.208522 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444"
Apr 22 17:53:06.208925 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:06.208620 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:53:07.208495 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:07.208456 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:07.208977 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:07.208593 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:08.208508 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:08.208468 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:08.208951 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:08.208522 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:08.208951 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:08.208597 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:53:08.208951 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:08.208743 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444"
Apr 22 17:53:08.544739 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:08.544646 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:08.544891 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:08.544814 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:08.544891 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:08.544873 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret podName:0289f618-f4aa-4688-a261-c755d1a71444 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:16.544856854 +0000 UTC m=+24.853488321 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret") pod "global-pull-secret-syncer-ldvlp" (UID: "0289f618-f4aa-4688-a261-c755d1a71444") : object "kube-system"/"original-pull-secret" not registered
Apr 22 17:53:09.208719 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:09.208679 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:09.209140 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:09.208809 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:09.853257 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:09.853220 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:09.853565 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:09.853376 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:09.853565 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:09.853456 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs podName:ab99124f-2959-4b17-ab76-24041f074fe5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:25.853435737 +0000 UTC m=+34.162067202 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs") pod "network-metrics-daemon-k7kpf" (UID: "ab99124f-2959-4b17-ab76-24041f074fe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:09.953824 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:09.953784 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:09.953993 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:09.953967 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:09.953993 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:09.953991 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:09.954108 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:09.954004 2568 projected.go:194] Error preparing data for projected volume kube-api-access-bjpp8 for pod openshift-network-diagnostics/network-check-target-4phwt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:09.954108 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:09.954069 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8 podName:d950d834-86a0-437a-b1c6-30e88678d30b nodeName:}" failed.
No retries permitted until 2026-04-22 17:53:25.954050655 +0000 UTC m=+34.262682137 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-bjpp8" (UniqueName: "kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8") pod "network-check-target-4phwt" (UID: "d950d834-86a0-437a-b1c6-30e88678d30b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:10.208743 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:10.208640 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:10.209100 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:10.208644 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp" Apr 22 17:53:10.209100 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:10.208791 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b" Apr 22 17:53:10.209100 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:10.208855 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444" Apr 22 17:53:11.208742 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:11.208698 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:53:11.209156 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:11.208836 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5" Apr 22 17:53:12.209305 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.209142 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:12.209844 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.209203 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp" Apr 22 17:53:12.209844 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:12.209356 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b" Apr 22 17:53:12.209844 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:12.209435 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444" Apr 22 17:53:12.283415 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.283385 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sbk9w" event={"ID":"3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca","Type":"ContainerStarted","Data":"2d9a6d06cbc5d27bca4596c41340ffe6fe30be4e2d1bfe19f30884670635303c"} Apr 22 17:53:12.284600 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.284573 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tjtfp" event={"ID":"8c0ae7fd-c205-4928-b51f-9f80202d3f77","Type":"ContainerStarted","Data":"0b5adefbd167927efcaa4c0b2fccc740bb09710f61eecba0e2a8254c10c7ddd6"} Apr 22 17:53:12.285687 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.285663 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ccl29" event={"ID":"8718afb8-70e3-49cc-879c-d5cc4a081622","Type":"ContainerStarted","Data":"22a6446992d4ff159c31cd9a4c9f8174714aea38de3d3d25cf1b3bd548d06adc"} Apr 22 17:53:12.286796 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.286775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerStarted","Data":"3dfa383c2439dd6ae1b614cd4c3cf7057ca66f1a80372d4e239d16193607785c"} Apr 22 17:53:12.287971 ip-10-0-142-118 
kubenswrapper[2568]: I0422 17:53:12.287948 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" event={"ID":"806c5b6e-2a45-4930-8e2d-7acdcc9590f9","Type":"ContainerStarted","Data":"c165d98cb51ab1e5d7e0a59d104ac6749ac0526562ebe3f6077bfa2d7b665769"} Apr 22 17:53:12.289057 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.289034 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ql9lr" event={"ID":"1bfe7678-f24d-4f1f-81a3-b65e7179ae30","Type":"ContainerStarted","Data":"8f621596f8cdf3fa9b51127709a9fac174f00103e8ae89384c82209c696a26ae"} Apr 22 17:53:12.290260 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.290241 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4ljm6" event={"ID":"0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3","Type":"ContainerStarted","Data":"77d691ad5674887ca6b883172afa2ec1c87931f3e92f278f1caf18185e1f6ccb"} Apr 22 17:53:12.329002 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.328936 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tjtfp" podStartSLOduration=7.906912178 podStartE2EDuration="20.32891785s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:52:54.95561522 +0000 UTC m=+3.264246685" lastFinishedPulling="2026-04-22 17:53:07.377620877 +0000 UTC m=+15.686252357" observedRunningTime="2026-04-22 17:53:12.328591305 +0000 UTC m=+20.637222791" watchObservedRunningTime="2026-04-22 17:53:12.32891785 +0000 UTC m=+20.637549405" Apr 22 17:53:12.329493 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.329454 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sbk9w" podStartSLOduration=3.488508104 podStartE2EDuration="20.329441416s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:52:54.954056065 +0000 UTC m=+3.262687540" 
lastFinishedPulling="2026-04-22 17:53:11.794989376 +0000 UTC m=+20.103620852" observedRunningTime="2026-04-22 17:53:12.308716923 +0000 UTC m=+20.617348411" watchObservedRunningTime="2026-04-22 17:53:12.329441416 +0000 UTC m=+20.638072914" Apr 22 17:53:12.358611 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.358558 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ccl29" podStartSLOduration=3.461723218 podStartE2EDuration="20.358539619s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:52:54.951125615 +0000 UTC m=+3.259757079" lastFinishedPulling="2026-04-22 17:53:11.847942011 +0000 UTC m=+20.156573480" observedRunningTime="2026-04-22 17:53:12.357702853 +0000 UTC m=+20.666334340" watchObservedRunningTime="2026-04-22 17:53:12.358539619 +0000 UTC m=+20.667171106" Apr 22 17:53:12.452991 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.452947 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ql9lr" podStartSLOduration=3.560451981 podStartE2EDuration="20.452931599s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:52:54.955346166 +0000 UTC m=+3.263977637" lastFinishedPulling="2026-04-22 17:53:11.847825782 +0000 UTC m=+20.156457255" observedRunningTime="2026-04-22 17:53:12.42251464 +0000 UTC m=+20.731146126" watchObservedRunningTime="2026-04-22 17:53:12.452931599 +0000 UTC m=+20.761563087" Apr 22 17:53:12.453130 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:12.453096 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4ljm6" podStartSLOduration=3.542113205 podStartE2EDuration="20.453090729s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:52:54.945439646 +0000 UTC m=+3.254071124" lastFinishedPulling="2026-04-22 17:53:11.856417179 +0000 UTC m=+20.165048648" 
observedRunningTime="2026-04-22 17:53:12.452338968 +0000 UTC m=+20.760970454" watchObservedRunningTime="2026-04-22 17:53:12.453090729 +0000 UTC m=+20.761722245" Apr 22 17:53:13.208946 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.208713 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:53:13.209084 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:13.209005 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5" Apr 22 17:53:13.293317 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.293271 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h4knh" event={"ID":"d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd","Type":"ContainerStarted","Data":"29d5f4d47428d6ca31c90a08178fdb9b5d56005bffa053bfb8ccc24fc30d9279"} Apr 22 17:53:13.294925 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.294897 2568 generic.go:358] "Generic (PLEG): container finished" podID="e29ab8a7-8881-4951-93eb-55d0b996dbcb" containerID="3dfa383c2439dd6ae1b614cd4c3cf7057ca66f1a80372d4e239d16193607785c" exitCode=0 Apr 22 17:53:13.295047 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.294964 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerDied","Data":"3dfa383c2439dd6ae1b614cd4c3cf7057ca66f1a80372d4e239d16193607785c"} Apr 22 17:53:13.297881 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.297810 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"3b254bf4d1a36f44326c0ed8384807d2d5c0e713ac0779ec33e78c7a4e056f99"} Apr 22 17:53:13.297881 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.297837 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"23e7738d465face0ce450237b413a81e5335a98b3e33602f498ea9daa7fc7a79"} Apr 22 17:53:13.297881 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.297850 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"1f456635063acafb131745955274a1e4a31645a9bc0596c2c483327815d06899"} Apr 22 17:53:13.297881 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.297864 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"940db108b324e5a164ae97dc4c9b2cff2ca65aaee3c4b73c2a28516a269cebe6"} Apr 22 17:53:13.297881 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.297877 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"899e804d1fe9c7cc564862af3286a5f4be46baa5838b6fc50ad283d4720a01f6"} Apr 22 17:53:13.298179 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.297889 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"e6a15774c26ed8dadeaeb4d30067eca5a96d2004e2fd77d716ef279bed8aa2a8"} Apr 22 17:53:13.312973 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.312928 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h4knh" podStartSLOduration=4.42050545 podStartE2EDuration="21.31291339s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:52:54.955413414 +0000 UTC m=+3.264044894" lastFinishedPulling="2026-04-22 17:53:11.84782137 +0000 UTC m=+20.156452834" observedRunningTime="2026-04-22 17:53:13.311455871 +0000 UTC m=+21.620087358" watchObservedRunningTime="2026-04-22 17:53:13.31291339 +0000 UTC m=+21.621544880" Apr 22 17:53:13.498054 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:13.498034 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:53:14.182222 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:14.182076 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:53:13.498049457Z","UUID":"073cda33-6247-45e6-b9c9-d80d5f738ca2","Handler":null,"Name":"","Endpoint":""} Apr 22 17:53:14.184819 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:14.184290 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:53:14.184819 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:14.184340 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:53:14.208510 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:14.208484 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:14.208660 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:14.208520 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp" Apr 22 17:53:14.208660 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:14.208604 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b" Apr 22 17:53:14.208786 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:14.208686 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444" Apr 22 17:53:14.301961 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:14.301919 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" event={"ID":"806c5b6e-2a45-4930-8e2d-7acdcc9590f9","Type":"ContainerStarted","Data":"e6c20567947c5c04825b91d33ed46ba8b12686d1334891d8c3a4184298d2c66e"} Apr 22 17:53:15.209106 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:15.208861 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:53:15.209262 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:15.209172 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5" Apr 22 17:53:15.307539 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:15.307504 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"87ae2860aec92c19a6a12d843c2ff33f593282b7f573c669297bbbfdd2be79d8"} Apr 22 17:53:15.309492 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:15.309463 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" event={"ID":"806c5b6e-2a45-4930-8e2d-7acdcc9590f9","Type":"ContainerStarted","Data":"278e21ef6e002e2f0c94362cc0dd19e37cbd28cb53b72f711801376c8f865e78"} Apr 22 17:53:15.328508 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:15.328447 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zm29x" podStartSLOduration=3.960702264 podStartE2EDuration="23.328431962s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:52:54.956473277 +0000 UTC m=+3.265104742" lastFinishedPulling="2026-04-22 17:53:14.324202973 +0000 UTC m=+22.632834440" observedRunningTime="2026-04-22 17:53:15.327702136 +0000 UTC m=+23.636333623" watchObservedRunningTime="2026-04-22 17:53:15.328431962 +0000 UTC m=+23.637063449" Apr 22 17:53:15.396846 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:15.396813 2568 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:53:16.027899 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:16.027856 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:53:16.028746 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:16.028711 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:53:16.208141 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:16.208104 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:16.208141 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:16.208121 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp" Apr 22 17:53:16.208381 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:16.208236 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b" Apr 22 17:53:16.208381 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:16.208357 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444" Apr 22 17:53:16.312174 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:16.312088 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ql9lr" Apr 22 17:53:16.605513 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:16.605433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp" Apr 22 17:53:16.605675 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:16.605595 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:16.605763 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:16.605687 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret podName:0289f618-f4aa-4688-a261-c755d1a71444 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.605667064 +0000 UTC m=+40.914298544 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret") pod "global-pull-secret-syncer-ldvlp" (UID: "0289f618-f4aa-4688-a261-c755d1a71444") : object "kube-system"/"original-pull-secret" not registered Apr 22 17:53:17.208236 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:17.208159 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:53:17.208391 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:17.208292 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5" Apr 22 17:53:18.208025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.207861 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:18.208605 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:18.208086 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b" Apr 22 17:53:18.208605 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.207861 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp" Apr 22 17:53:18.208605 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:18.208145 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444" Apr 22 17:53:18.315877 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.315802 2568 generic.go:358] "Generic (PLEG): container finished" podID="e29ab8a7-8881-4951-93eb-55d0b996dbcb" containerID="2da95f18ca20e19579a610c2cfcdcf0649bb348e176b75fdc068f656d4d27df9" exitCode=0 Apr 22 17:53:18.315877 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.315868 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerDied","Data":"2da95f18ca20e19579a610c2cfcdcf0649bb348e176b75fdc068f656d4d27df9"} Apr 22 17:53:18.319302 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.319269 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" event={"ID":"b920c1ec-1c95-459e-a9cf-a36565ac5b48","Type":"ContainerStarted","Data":"7fe51dddc0a2d1d6855f094c99c11c136f6b7e393693908cb3e0116aa94a0690"} Apr 22 17:53:18.319567 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.319550 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:53:18.319627 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.319573 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:53:18.333450 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.333430 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" Apr 22 17:53:18.359392 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:18.359357 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv" podStartSLOduration=8.971289286 podStartE2EDuration="26.359347143s" podCreationTimestamp="2026-04-22 17:52:52 
+0000 UTC" firstStartedPulling="2026-04-22 17:52:54.946903132 +0000 UTC m=+3.255534603" lastFinishedPulling="2026-04-22 17:53:12.334960995 +0000 UTC m=+20.643592460" observedRunningTime="2026-04-22 17:53:18.358926686 +0000 UTC m=+26.667558172" watchObservedRunningTime="2026-04-22 17:53:18.359347143 +0000 UTC m=+26.667978629"
Apr 22 17:53:19.207853 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.207829 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:19.207949 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:19.207937 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:19.322555 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.322497 2568 generic.go:358] "Generic (PLEG): container finished" podID="e29ab8a7-8881-4951-93eb-55d0b996dbcb" containerID="5da779a168734dee991bd70495ebc435797a4d53938e57071694805e6d69ca82" exitCode=0
Apr 22 17:53:19.322850 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.322578 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerDied","Data":"5da779a168734dee991bd70495ebc435797a4d53938e57071694805e6d69ca82"}
Apr 22 17:53:19.323129 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.323100 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:53:19.338074 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.338052 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:53:19.358257 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.358232 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4phwt"]
Apr 22 17:53:19.358372 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.358341 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:19.358439 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:19.358422 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:53:19.360664 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.360645 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k7kpf"]
Apr 22 17:53:19.360748 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.360741 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:19.360834 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:19.360815 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:19.375698 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.375677 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ldvlp"]
Apr 22 17:53:19.375804 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:19.375792 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:19.375890 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:19.375868 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444"
Apr 22 17:53:20.325953 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:20.325871 2568 generic.go:358] "Generic (PLEG): container finished" podID="e29ab8a7-8881-4951-93eb-55d0b996dbcb" containerID="269ba88e12a630a0d2b23af40bc081be274a43b7a0b542b393dddd58dd1658e6" exitCode=0
Apr 22 17:53:20.326329 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:20.325963 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerDied","Data":"269ba88e12a630a0d2b23af40bc081be274a43b7a0b542b393dddd58dd1658e6"}
Apr 22 17:53:21.208325 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:21.208294 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:21.208477 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:21.208299 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:21.208477 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:21.208426 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444"
Apr 22 17:53:21.208477 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:21.208299 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:21.208611 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:21.208474 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:53:21.208611 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:21.208591 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:23.208327 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:23.208088 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:23.208862 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:23.208090 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:23.208862 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:23.208374 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ldvlp" podUID="0289f618-f4aa-4688-a261-c755d1a71444"
Apr 22 17:53:23.208862 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:23.208090 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:23.208862 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:23.208457 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-k7kpf" podUID="ab99124f-2959-4b17-ab76-24041f074fe5"
Apr 22 17:53:23.208862 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:23.208534 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4phwt" podUID="d950d834-86a0-437a-b1c6-30e88678d30b"
Apr 22 17:53:24.991292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:24.991216 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-118.ec2.internal" event="NodeReady"
Apr 22 17:53:24.991757 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:24.991336 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 17:53:25.032011 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.031980 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv"]
Apr 22 17:53:25.060236 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.060093 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8599fb6f6c-qtwn7"]
Apr 22 17:53:25.060236 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.060214 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv"
Apr 22 17:53:25.062779 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.062757 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.063242 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.063018 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zqrv4\""
Apr 22 17:53:25.063415 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.063395 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.075389 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.075363 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7v4cv"]
Apr 22 17:53:25.093853 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.093707 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-b747876cb-7f77q"]
Apr 22 17:53:25.093853 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.093805 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.094589 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.094134 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22 17:53:25.096425 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.096403 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 17:53:25.096545 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.096452 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 17:53:25.096639 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.096617 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tlhcz\""
Apr 22 17:53:25.096695 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.096667 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 17:53:25.096804 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.096784 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.096906 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.096893 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 17:53:25.096984 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.096967 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 17:53:25.097110 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.097089 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9zj4r\""
Apr 22 17:53:25.097668 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.097624 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.106590 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.106566 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv"]
Apr 22 17:53:25.106683 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.106599 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-7v4cv"]
Apr 22 17:53:25.106683 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.106615 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8599fb6f6c-qtwn7"]
Apr 22 17:53:25.106683 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.106650 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz"]
Apr 22 17:53:25.107227 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.107098 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:25.109511 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.109489 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 17:53:25.109594 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.109525 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 17:53:25.110098 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.110080 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.110098 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.110090 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 17:53:25.110224 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.110089 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 17:53:25.110224 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.110110 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 17:53:25.110390 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.110375 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-gh445\""
Apr 22 17:53:25.110479 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.110411 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 17:53:25.110479 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.110472 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.127654 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.127635 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rnpt6"]
Apr 22 17:53:25.127796 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.127781 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz"
Apr 22 17:53:25.130477 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.130456 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 22 17:53:25.130566 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.130520 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 22 17:53:25.130634 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.130620 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-fnq5q\""
Apr 22 17:53:25.130771 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.130751 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.131270 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.131081 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.148249 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.148228 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vkb44"]
Apr 22 17:53:25.148395 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.148377 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rnpt6"
Apr 22 17:53:25.150898 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.150878 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 17:53:25.150992 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.150947 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.151048 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.151007 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dchxj\""
Apr 22 17:53:25.151241 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.151210 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.168391 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168368 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0972f1d3-8168-44be-896c-c3d80cd4c9d7-serving-cert\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22 17:53:25.168497 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168411 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-registry-certificates\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.168497 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168444 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-installation-pull-secrets\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.168497 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f87f2c-183f-4d31-91cd-2752918acc59-ca-trust-extracted\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.168642 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168523 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4k64\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-kube-api-access-x4k64\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.168642 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168563 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcx74\" (UniqueName: \"kubernetes.io/projected/0972f1d3-8168-44be-896c-c3d80cd4c9d7-kube-api-access-wcx74\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22 17:53:25.168642 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g68p2\" (UniqueName: \"kubernetes.io/projected/418e6314-c842-4a4a-82f4-6daab5c36653-kube-api-access-g68p2\") pod \"volume-data-source-validator-7c6cbb6c87-vqhdv\" (UID: \"418e6314-c842-4a4a-82f4-6daab5c36653\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv"
Apr 22 17:53:25.168642 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168626 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0972f1d3-8168-44be-896c-c3d80cd4c9d7-config\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22 17:53:25.168822 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168655 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0972f1d3-8168-44be-896c-c3d80cd4c9d7-trusted-ca\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22 17:53:25.168822 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168681 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-image-registry-private-configuration\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.168822 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168710 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.168822 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-bound-sa-token\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.168822 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.168786 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-trusted-ca\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.169356 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.169316 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx"]
Apr 22 17:53:25.169506 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.169488 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-vkb44"
Apr 22 17:53:25.171996 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.171977 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 17:53:25.172095 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.171981 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.172199 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.172176 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.172301 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.172289 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-r5m7l\""
Apr 22 17:53:25.172361 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.172315 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 17:53:25.177827 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.177808 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 17:53:25.184986 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.184967 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb"]
Apr 22 17:53:25.185116 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.185098 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx"
Apr 22 17:53:25.187464 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.187444 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.187574 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.187481 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.187574 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.187487 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 17:53:25.187773 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.187758 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 17:53:25.203428 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.203410 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"]
Apr 22 17:53:25.203589 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.203573 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb"
Apr 22 17:53:25.209477 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.209456 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 17:53:25.209564 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.209465 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 17:53:25.209802 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.209783 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.209988 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.209967 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-7mfdm\""
Apr 22 17:53:25.210205 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.210186 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.220877 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.220856 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx"]
Apr 22 17:53:25.221011 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.220995 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt"
Apr 22 17:53:25.221089 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.221017 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"
Apr 22 17:53:25.221146 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.220996 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:25.221198 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.221138 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:25.224029 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.224012 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 17:53:25.224169 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.224154 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-rkw79\""
Apr 22 17:53:25.226302 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.225976 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c9cn8\""
Apr 22 17:53:25.226302 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.226059 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 17:53:25.226302 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.226083 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.226302 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.226109 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.226302 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.226200 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 17:53:25.226614 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.226368 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 17:53:25.226614 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.226410 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.226614 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.226467 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.226814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.226794 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7sslk\""
Apr 22 17:53:25.236856 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.236700 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"]
Apr 22 17:53:25.236856 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.236850 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx"
Apr 22 17:53:25.238944 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.238928 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-x7svr\""
Apr 22 17:53:25.256079 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.256019 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b747876cb-7f77q"]
Apr 22 17:53:25.256079 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.256052 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr"]
Apr 22 17:53:25.256285 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.256178 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"
Apr 22 17:53:25.258806 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.258789 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 17:53:25.258928 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.258804 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-kwq7h\""
Apr 22 17:53:25.259022 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.258830 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:25.259087 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.259049 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:25.269593 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.269572 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gm7c\" (UniqueName: \"kubernetes.io/projected/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-kube-api-access-8gm7c\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx"
Apr 22 17:53:25.269706 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.269614 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-image-registry-private-configuration\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.269706 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.269657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.269706 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.269682 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-stats-auth\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:25.269887 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.269759 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtzs\" (UniqueName: \"kubernetes.io/projected/fa19e254-4e3d-4822-81d3-7ea095625185-kube-api-access-jgtzs\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6"
Apr 22 17:53:25.269887 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.269796 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:25.269887 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.269813 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8599fb6f6c-qtwn7: secret "image-registry-tls" not found
Apr 22 17:53:25.269887 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.269824 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-trusted-ca\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:25.269887 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.269868 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls podName:75f87f2c-183f-4d31-91cd-2752918acc59 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:25.769849522 +0000 UTC m=+34.078480987 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls") pod "image-registry-8599fb6f6c-qtwn7" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59") : secret "image-registry-tls" not found Apr 22 17:53:25.270134 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.269889 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f87f2c-183f-4d31-91cd-2752918acc59-ca-trust-extracted\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.270134 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.269921 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-default-certificate\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.270134 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270024 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm"] Apr 22 17:53:25.270134 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270034 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-klusterlet-config\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.270134 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270058 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75454aa1-9f9c-481a-b5e0-248d97ce5213-service-ca-bundle\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.270134 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270085 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6" Apr 22 17:53:25.270134 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270110 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4k64\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-kube-api-access-x4k64\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.270459 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0972f1d3-8168-44be-896c-c3d80cd4c9d7-trusted-ca\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.270459 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f87f2c-183f-4d31-91cd-2752918acc59-ca-trust-extracted\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " 
pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.270459 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270301 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.270459 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270365 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-bound-sa-token\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.270459 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270409 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/75454aa1-9f9c-481a-b5e0-248d97ce5213-snapshots\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.270785 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270470 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwwhq\" (UniqueName: \"kubernetes.io/projected/75454aa1-9f9c-481a-b5e0-248d97ce5213-kube-api-access-dwwhq\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.270785 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270505 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0972f1d3-8168-44be-896c-c3d80cd4c9d7-serving-cert\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.270785 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.270785 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270562 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75454aa1-9f9c-481a-b5e0-248d97ce5213-tmp\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.270785 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.270585 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99308cc1-5395-417c-bf2d-54fe0c5411d7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.271643 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271613 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-registry-certificates\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.271766 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271675 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-tmp\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.271766 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271706 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75454aa1-9f9c-481a-b5e0-248d97ce5213-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.271766 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-registry-certificates\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.271766 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271758 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6lv\" (UniqueName: \"kubernetes.io/projected/99308cc1-5395-417c-bf2d-54fe0c5411d7-kube-api-access-9r6lv\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.271960 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271802 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-installation-pull-secrets\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.271960 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271836 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.271960 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271856 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-trusted-ca\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.271960 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271870 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99308cc1-5395-417c-bf2d-54fe0c5411d7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.271960 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271905 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcx74\" (UniqueName: \"kubernetes.io/projected/0972f1d3-8168-44be-896c-c3d80cd4c9d7-kube-api-access-wcx74\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.271960 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsrk6\" (UniqueName: \"kubernetes.io/projected/224a42db-ff4d-4e18-a064-b7f2a7b10e91-kube-api-access-wsrk6\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.272238 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271961 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0972f1d3-8168-44be-896c-c3d80cd4c9d7-trusted-ca\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.272238 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.271980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g68p2\" (UniqueName: \"kubernetes.io/projected/418e6314-c842-4a4a-82f4-6daab5c36653-kube-api-access-g68p2\") pod \"volume-data-source-validator-7c6cbb6c87-vqhdv\" (UID: \"418e6314-c842-4a4a-82f4-6daab5c36653\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv" Apr 22 17:53:25.272238 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.272013 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0972f1d3-8168-44be-896c-c3d80cd4c9d7-config\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.272238 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.272046 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75454aa1-9f9c-481a-b5e0-248d97ce5213-serving-cert\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.273573 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.272907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0972f1d3-8168-44be-896c-c3d80cd4c9d7-config\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.277647 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.274992 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 17:53:25.277647 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.275283 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 17:53:25.277647 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.275509 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 17:53:25.277647 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.275775 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 17:53:25.278458 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.278058 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0972f1d3-8168-44be-896c-c3d80cd4c9d7-serving-cert\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: 
\"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.278458 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.278080 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-image-registry-private-configuration\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.282235 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.282002 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcx74\" (UniqueName: \"kubernetes.io/projected/0972f1d3-8168-44be-896c-c3d80cd4c9d7-kube-api-access-wcx74\") pod \"console-operator-9d4b6777b-7v4cv\" (UID: \"0972f1d3-8168-44be-896c-c3d80cd4c9d7\") " pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.282235 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.282035 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-bound-sa-token\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.282639 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.282615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-installation-pull-secrets\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.283374 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.283356 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g68p2\" (UniqueName: \"kubernetes.io/projected/418e6314-c842-4a4a-82f4-6daab5c36653-kube-api-access-g68p2\") pod \"volume-data-source-validator-7c6cbb6c87-vqhdv\" (UID: \"418e6314-c842-4a4a-82f4-6daab5c36653\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv" Apr 22 17:53:25.283580 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.283560 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4k64\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-kube-api-access-x4k64\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.293870 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293848 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vkb44"] Apr 22 17:53:25.293870 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293874 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rnpt6"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293884 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293892 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293904 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293915 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293926 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293936 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293951 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293961 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293978 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zs6sw"] Apr 22 17:53:25.294025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.293996 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" Apr 22 17:53:25.297493 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.297475 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 17:53:25.297708 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.297691 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-7g774\"" Apr 22 17:53:25.311214 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.311195 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zs6sw"] Apr 22 17:53:25.311335 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.311323 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.313914 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.313896 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:53:25.313914 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.313908 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rhqnx\"" Apr 22 17:53:25.314052 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.314002 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:53:25.371511 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.371469 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv" Apr 22 17:53:25.373420 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373393 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6" Apr 22 17:53:25.373543 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373456 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv66b\" (UniqueName: \"kubernetes.io/projected/0b69db01-4663-4db0-84fe-b0eaeccdfb5a-kube-api-access-zv66b\") pod \"network-check-source-8894fc9bd-9wdzx\" (UID: \"0b69db01-4663-4db0-84fe-b0eaeccdfb5a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx" Apr 22 17:53:25.373543 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373525 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.373650 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373553 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" Apr 22 17:53:25.373650 ip-10-0-142-118 
kubenswrapper[2568]: I0422 17:53:25.373578 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dc88d73f-15f0-4054-82fd-935550f076e2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86cdd58467-cj5lm\" (UID: \"dc88d73f-15f0-4054-82fd-935550f076e2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" Apr 22 17:53:25.373650 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.373584 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:53:25.373650 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373602 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-ca\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.373650 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373628 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqv5h\" (UniqueName: \"kubernetes.io/projected/dc88d73f-15f0-4054-82fd-935550f076e2-kube-api-access-pqv5h\") pod \"managed-serviceaccount-addon-agent-86cdd58467-cj5lm\" (UID: \"dc88d73f-15f0-4054-82fd-935550f076e2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" Apr 22 17:53:25.373650 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.373646 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert podName:fa19e254-4e3d-4822-81d3-7ea095625185 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:25.873625447 +0000 UTC m=+34.182256930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert") pod "ingress-canary-rnpt6" (UID: "fa19e254-4e3d-4822-81d3-7ea095625185") : secret "canary-serving-cert" not found Apr 22 17:53:25.373968 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373744 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/242c5d0e-f778-473e-a7b6-3a94132fea7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.373968 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49fnl\" (UniqueName: \"kubernetes.io/projected/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-kube-api-access-49fnl\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.373968 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.373968 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373844 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/75454aa1-9f9c-481a-b5e0-248d97ce5213-snapshots\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.373968 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373884 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/28c65550-3cca-4589-82a4-baaf985beda6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.373968 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373915 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.373968 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.373966 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwwhq\" (UniqueName: \"kubernetes.io/projected/75454aa1-9f9c-481a-b5e0-248d97ce5213-kube-api-access-dwwhq\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " 
pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374021 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75454aa1-9f9c-481a-b5e0-248d97ce5213-tmp\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374039 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99308cc1-5395-417c-bf2d-54fe0c5411d7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-hub\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374096 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-tmp\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374121 2568 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75454aa1-9f9c-481a-b5e0-248d97ce5213-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.374146 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:25.874126964 +0000 UTC m=+34.182758443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : configmap references non-existent config key: service-ca.crt Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374179 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6lv\" (UniqueName: \"kubernetes.io/projected/99308cc1-5395-417c-bf2d-54fe0c5411d7-kube-api-access-9r6lv\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374236 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374265 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99308cc1-5395-417c-bf2d-54fe0c5411d7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.374292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374295 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsrk6\" (UniqueName: \"kubernetes.io/projected/224a42db-ff4d-4e18-a064-b7f2a7b10e91-kube-api-access-wsrk6\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nv46\" (UniqueName: \"kubernetes.io/projected/242c5d0e-f778-473e-a7b6-3a94132fea7c-kube-api-access-6nv46\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374386 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vknbh\" 
(UniqueName: \"kubernetes.io/projected/3267974c-a8ce-4fa1-98cf-6213634080a0-kube-api-access-vknbh\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374427 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75454aa1-9f9c-481a-b5e0-248d97ce5213-serving-cert\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gm7c\" (UniqueName: \"kubernetes.io/projected/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-kube-api-access-8gm7c\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374469 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/75454aa1-9f9c-481a-b5e0-248d97ce5213-tmp\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-stats-auth\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 
17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374538 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/75454aa1-9f9c-481a-b5e0-248d97ce5213-snapshots\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-tmp\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374539 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtzs\" (UniqueName: \"kubernetes.io/projected/fa19e254-4e3d-4822-81d3-7ea095625185-kube-api-access-jgtzs\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.374576 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242c5d0e-f778-473e-a7b6-3a94132fea7c-config\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.374629 
2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:25.874613243 +0000 UTC m=+34.183244724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : secret "router-metrics-certs-default" not found Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-default-certificate\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-klusterlet-config\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.374831 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374774 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsz49\" (UniqueName: \"kubernetes.io/projected/28c65550-3cca-4589-82a4-baaf985beda6-kube-api-access-jsz49\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.375602 
ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374802 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75454aa1-9f9c-481a-b5e0-248d97ce5213-service-ca-bundle\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.375602 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.374970 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75454aa1-9f9c-481a-b5e0-248d97ce5213-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.375602 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.375022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99308cc1-5395-417c-bf2d-54fe0c5411d7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.375602 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.375327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75454aa1-9f9c-481a-b5e0-248d97ce5213-service-ca-bundle\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.377224 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.377198 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/99308cc1-5395-417c-bf2d-54fe0c5411d7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.377324 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.377313 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75454aa1-9f9c-481a-b5e0-248d97ce5213-serving-cert\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.378302 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.378283 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-klusterlet-config\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.384019 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.383977 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwwhq\" (UniqueName: \"kubernetes.io/projected/75454aa1-9f9c-481a-b5e0-248d97ce5213-kube-api-access-dwwhq\") pod \"insights-operator-585dfdc468-vkb44\" (UID: \"75454aa1-9f9c-481a-b5e0-248d97ce5213\") " pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.384691 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.384648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gm7c\" (UniqueName: \"kubernetes.io/projected/7f7140d2-3c3c-477e-ab7b-229503f3cbd9-kube-api-access-8gm7c\") pod \"klusterlet-addon-workmgr-697d7b9785-dwksx\" (UID: \"7f7140d2-3c3c-477e-ab7b-229503f3cbd9\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.384849 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.384815 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6lv\" (UniqueName: \"kubernetes.io/projected/99308cc1-5395-417c-bf2d-54fe0c5411d7-kube-api-access-9r6lv\") pod \"kube-storage-version-migrator-operator-6769c5d45-kwxxz\" (UID: \"99308cc1-5395-417c-bf2d-54fe0c5411d7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.385094 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.385072 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-stats-auth\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.385163 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.385108 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-default-certificate\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.385322 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.385303 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsrk6\" (UniqueName: \"kubernetes.io/projected/224a42db-ff4d-4e18-a064-b7f2a7b10e91-kube-api-access-wsrk6\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.385445 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.385431 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jgtzs\" (UniqueName: \"kubernetes.io/projected/fa19e254-4e3d-4822-81d3-7ea095625185-kube-api-access-jgtzs\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6" Apr 22 17:53:25.414521 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.414494 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" Apr 22 17:53:25.436254 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.436234 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" Apr 22 17:53:25.475132 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475104 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" Apr 22 17:53:25.475256 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dc88d73f-15f0-4054-82fd-935550f076e2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86cdd58467-cj5lm\" (UID: \"dc88d73f-15f0-4054-82fd-935550f076e2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" Apr 22 17:53:25.475256 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475157 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-ca\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: 
\"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.475256 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pqv5h\" (UniqueName: \"kubernetes.io/projected/dc88d73f-15f0-4054-82fd-935550f076e2-kube-api-access-pqv5h\") pod \"managed-serviceaccount-addon-agent-86cdd58467-cj5lm\" (UID: \"dc88d73f-15f0-4054-82fd-935550f076e2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" Apr 22 17:53:25.475256 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475222 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/242c5d0e-f778-473e-a7b6-3a94132fea7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.475256 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475246 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49fnl\" (UniqueName: \"kubernetes.io/projected/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-kube-api-access-49fnl\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.475256 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.475254 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:53:25.475539 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.475539 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475295 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/28c65550-3cca-4589-82a4-baaf985beda6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.475539 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.475316 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls podName:3267974c-a8ce-4fa1-98cf-6213634080a0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:25.975295499 +0000 UTC m=+34.283926972 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lv8r8" (UID: "3267974c-a8ce-4fa1-98cf-6213634080a0") : secret "samples-operator-tls" not found Apr 22 17:53:25.475539 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475354 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.475539 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475489 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.475539 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8f0708c3-8b05-45e1-9d30-ca3772151671-tmp-dir\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.475896 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-hub\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 
17:53:25.475896 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475608 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f0708c3-8b05-45e1-9d30-ca3772151671-config-volume\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.475896 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.475896 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nv46\" (UniqueName: \"kubernetes.io/projected/242c5d0e-f778-473e-a7b6-3a94132fea7c-kube-api-access-6nv46\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.475896 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.475722 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vknbh\" (UniqueName: \"kubernetes.io/projected/3267974c-a8ce-4fa1-98cf-6213634080a0-kube-api-access-vknbh\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" Apr 22 17:53:25.476133 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.476016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/28c65550-3cca-4589-82a4-baaf985beda6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.477103 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.476364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242c5d0e-f778-473e-a7b6-3a94132fea7c-config\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.477103 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.476404 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqz5k\" (UniqueName: \"kubernetes.io/projected/8f0708c3-8b05-45e1-9d30-ca3772151671-kube-api-access-xqz5k\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.477103 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.476450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsz49\" (UniqueName: \"kubernetes.io/projected/28c65550-3cca-4589-82a4-baaf985beda6-kube-api-access-jsz49\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.477103 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.476497 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv66b\" (UniqueName: \"kubernetes.io/projected/0b69db01-4663-4db0-84fe-b0eaeccdfb5a-kube-api-access-zv66b\") pod \"network-check-source-8894fc9bd-9wdzx\" (UID: \"0b69db01-4663-4db0-84fe-b0eaeccdfb5a\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx" Apr 22 17:53:25.477103 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.476538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.477103 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.476649 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:53:25.477103 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.476710 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls podName:28c65550-3cca-4589-82a4-baaf985beda6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:25.97669326 +0000 UTC m=+34.285324727 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pmdgh" (UID: "28c65550-3cca-4589-82a4-baaf985beda6") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:53:25.477654 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.477636 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/242c5d0e-f778-473e-a7b6-3a94132fea7c-config\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.478292 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.478238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/242c5d0e-f778-473e-a7b6-3a94132fea7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.478407 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.478317 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/dc88d73f-15f0-4054-82fd-935550f076e2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86cdd58467-cj5lm\" (UID: \"dc88d73f-15f0-4054-82fd-935550f076e2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" Apr 22 17:53:25.478811 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.478787 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: 
\"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.478896 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.478809 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.479182 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.479164 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-vkb44" Apr 22 17:53:25.479492 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.479473 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-ca\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.479624 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.479604 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-hub\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.480790 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.480768 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: 
\"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.484769 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.484746 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vknbh\" (UniqueName: \"kubernetes.io/projected/3267974c-a8ce-4fa1-98cf-6213634080a0-kube-api-access-vknbh\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" Apr 22 17:53:25.484769 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.484758 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nv46\" (UniqueName: \"kubernetes.io/projected/242c5d0e-f778-473e-a7b6-3a94132fea7c-kube-api-access-6nv46\") pod \"service-ca-operator-d6fc45fc5-sgbxb\" (UID: \"242c5d0e-f778-473e-a7b6-3a94132fea7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.484996 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.484954 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqv5h\" (UniqueName: \"kubernetes.io/projected/dc88d73f-15f0-4054-82fd-935550f076e2-kube-api-access-pqv5h\") pod \"managed-serviceaccount-addon-agent-86cdd58467-cj5lm\" (UID: \"dc88d73f-15f0-4054-82fd-935550f076e2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" Apr 22 17:53:25.485343 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.485324 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv66b\" (UniqueName: \"kubernetes.io/projected/0b69db01-4663-4db0-84fe-b0eaeccdfb5a-kube-api-access-zv66b\") pod \"network-check-source-8894fc9bd-9wdzx\" (UID: \"0b69db01-4663-4db0-84fe-b0eaeccdfb5a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx" Apr 22 17:53:25.485437 
ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.485421 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsz49\" (UniqueName: \"kubernetes.io/projected/28c65550-3cca-4589-82a4-baaf985beda6-kube-api-access-jsz49\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.486256 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.486235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49fnl\" (UniqueName: \"kubernetes.io/projected/7cd04bd0-8da0-4aa1-8212-af5aa3c652d6-kube-api-access-49fnl\") pod \"cluster-proxy-proxy-agent-86784fd9d-bgfcr\" (UID: \"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.497943 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.497894 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" Apr 22 17:53:25.513055 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.513031 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" Apr 22 17:53:25.560433 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.560399 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx" Apr 22 17:53:25.577766 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.577717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz5k\" (UniqueName: \"kubernetes.io/projected/8f0708c3-8b05-45e1-9d30-ca3772151671-kube-api-access-xqz5k\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.577913 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.577849 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.577913 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.577897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8f0708c3-8b05-45e1-9d30-ca3772151671-tmp-dir\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.578018 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.577940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f0708c3-8b05-45e1-9d30-ca3772151671-config-volume\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.578018 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.577982 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:53:25.578101 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.578060 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls podName:8f0708c3-8b05-45e1-9d30-ca3772151671 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:26.078039127 +0000 UTC m=+34.386670598 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls") pod "dns-default-zs6sw" (UID: "8f0708c3-8b05-45e1-9d30-ca3772151671") : secret "dns-default-metrics-tls" not found Apr 22 17:53:25.578348 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.578327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8f0708c3-8b05-45e1-9d30-ca3772151671-tmp-dir\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.578596 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.578573 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f0708c3-8b05-45e1-9d30-ca3772151671-config-volume\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.588374 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.588349 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqz5k\" (UniqueName: \"kubernetes.io/projected/8f0708c3-8b05-45e1-9d30-ca3772151671-kube-api-access-xqz5k\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:25.619475 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.619443 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:53:25.626187 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.626157 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" Apr 22 17:53:25.779747 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.779651 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:25.779943 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.779822 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 17:53:25.779943 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.779845 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8599fb6f6c-qtwn7: secret "image-registry-tls" not found Apr 22 17:53:25.779943 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.779903 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls podName:75f87f2c-183f-4d31-91cd-2752918acc59 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:26.779888154 +0000 UTC m=+35.088519618 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls") pod "image-registry-8599fb6f6c-qtwn7" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59") : secret "image-registry-tls" not found Apr 22 17:53:25.880194 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.880162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.880372 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.880207 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:25.880372 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.880291 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6" Apr 22 17:53:25.880372 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.880360 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:26.880322179 +0000 UTC m=+35.188953647 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : configmap references non-existent config key: service-ca.crt Apr 22 17:53:25.880540 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.880414 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf" Apr 22 17:53:25.880540 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.880423 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:53:25.880540 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.880437 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:53:25.880540 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.880526 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:53:25.880540 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.880490 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:26.880473507 +0000 UTC m=+35.189104976 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : secret "router-metrics-certs-default" not found Apr 22 17:53:25.880701 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.880568 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert podName:fa19e254-4e3d-4822-81d3-7ea095625185 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:26.880550436 +0000 UTC m=+35.189181912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert") pod "ingress-canary-rnpt6" (UID: "fa19e254-4e3d-4822-81d3-7ea095625185") : secret "canary-serving-cert" not found Apr 22 17:53:25.880701 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.880587 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs podName:ab99124f-2959-4b17-ab76-24041f074fe5 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:57.880578136 +0000 UTC m=+66.189209604 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs") pod "network-metrics-daemon-k7kpf" (UID: "ab99124f-2959-4b17-ab76-24041f074fe5") : secret "metrics-daemon-secret" not found Apr 22 17:53:25.983568 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.982574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:25.983568 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.982777 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:25.983568 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.982825 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" Apr 22 17:53:25.983568 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.982962 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:53:25.983568 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.983038 2568 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls podName:3267974c-a8ce-4fa1-98cf-6213634080a0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:26.983019258 +0000 UTC m=+35.291650726 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lv8r8" (UID: "3267974c-a8ce-4fa1-98cf-6213634080a0") : secret "samples-operator-tls" not found Apr 22 17:53:25.983984 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.983943 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:53:25.984036 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:25.984000 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls podName:28c65550-3cca-4589-82a4-baaf985beda6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:26.983983416 +0000 UTC m=+35.292614884 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pmdgh" (UID: "28c65550-3cca-4589-82a4-baaf985beda6") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:53:25.992045 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:25.991165 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjpp8\" (UniqueName: \"kubernetes.io/projected/d950d834-86a0-437a-b1c6-30e88678d30b-kube-api-access-bjpp8\") pod \"network-check-target-4phwt\" (UID: \"d950d834-86a0-437a-b1c6-30e88678d30b\") " pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:26.088340 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.083636 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:26.088482 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.085447 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:53:26.088553 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.088541 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls podName:8f0708c3-8b05-45e1-9d30-ca3772151671 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:27.088518068 +0000 UTC m=+35.397149540 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls") pod "dns-default-zs6sw" (UID: "8f0708c3-8b05-45e1-9d30-ca3772151671") : secret "dns-default-metrics-tls" not found Apr 22 17:53:26.134217 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.133655 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:26.206342 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.206311 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx"] Apr 22 17:53:26.211800 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.211778 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz"] Apr 22 17:53:26.211870 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.211809 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vkb44"] Apr 22 17:53:26.216228 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.216212 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr"] Apr 22 17:53:26.219787 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.219764 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv"] Apr 22 17:53:26.221084 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.221052 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm"] Apr 22 17:53:26.223123 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.223097 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-7v4cv"] Apr 22 17:53:26.233929 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.233892 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb"] Apr 22 17:53:26.234935 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.234918 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx"] Apr 22 17:53:26.252585 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.252559 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b69db01_4663_4db0_84fe_b0eaeccdfb5a.slice/crio-1d623f7781adfd5a41c76fc06a04a851119eef804b73adc07d7234688116db7a WatchSource:0}: Error finding container 1d623f7781adfd5a41c76fc06a04a851119eef804b73adc07d7234688116db7a: Status 404 returned error can't find the container with id 1d623f7781adfd5a41c76fc06a04a851119eef804b73adc07d7234688116db7a Apr 22 17:53:26.252965 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.252941 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99308cc1_5395_417c_bf2d_54fe0c5411d7.slice/crio-234f2bdc420eb57612f64c844cb802a9cff3aaa2463f1959ffe0aeb811fdff66 WatchSource:0}: Error finding container 234f2bdc420eb57612f64c844cb802a9cff3aaa2463f1959ffe0aeb811fdff66: Status 404 returned error can't find the container with id 234f2bdc420eb57612f64c844cb802a9cff3aaa2463f1959ffe0aeb811fdff66 Apr 22 17:53:26.253760 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.253738 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75454aa1_9f9c_481a_b5e0_248d97ce5213.slice/crio-4618bfb750780d61df5a889ca3fd40de9d5bda3b3a0a0166aa329d6c14475f1c WatchSource:0}: Error finding container 
4618bfb750780d61df5a889ca3fd40de9d5bda3b3a0a0166aa329d6c14475f1c: Status 404 returned error can't find the container with id 4618bfb750780d61df5a889ca3fd40de9d5bda3b3a0a0166aa329d6c14475f1c Apr 22 17:53:26.254594 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.254548 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cd04bd0_8da0_4aa1_8212_af5aa3c652d6.slice/crio-46ec13942201b00e0a575af2e7a1c01e462af893fa538dcf5fd658a5632bef4a WatchSource:0}: Error finding container 46ec13942201b00e0a575af2e7a1c01e462af893fa538dcf5fd658a5632bef4a: Status 404 returned error can't find the container with id 46ec13942201b00e0a575af2e7a1c01e462af893fa538dcf5fd658a5632bef4a Apr 22 17:53:26.255758 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.255645 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0972f1d3_8168_44be_896c_c3d80cd4c9d7.slice/crio-a44e745bdaa846cd58ffd75faf5add3d51e0c1a4ad8801d5035a4c223603eb29 WatchSource:0}: Error finding container a44e745bdaa846cd58ffd75faf5add3d51e0c1a4ad8801d5035a4c223603eb29: Status 404 returned error can't find the container with id a44e745bdaa846cd58ffd75faf5add3d51e0c1a4ad8801d5035a4c223603eb29 Apr 22 17:53:26.256495 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.256439 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc88d73f_15f0_4054_82fd_935550f076e2.slice/crio-1e7e7c50b442a8759c561bfc6204e82656c357bee86d0a4f855f51a445df6f04 WatchSource:0}: Error finding container 1e7e7c50b442a8759c561bfc6204e82656c357bee86d0a4f855f51a445df6f04: Status 404 returned error can't find the container with id 1e7e7c50b442a8759c561bfc6204e82656c357bee86d0a4f855f51a445df6f04 Apr 22 17:53:26.257865 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.257304 2568 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418e6314_c842_4a4a_82f4_6daab5c36653.slice/crio-208963e3de76a7e34ba3ff9a8c5c20ae91f9e3ee7e847d66b51ef7a270ea257d WatchSource:0}: Error finding container 208963e3de76a7e34ba3ff9a8c5c20ae91f9e3ee7e847d66b51ef7a270ea257d: Status 404 returned error can't find the container with id 208963e3de76a7e34ba3ff9a8c5c20ae91f9e3ee7e847d66b51ef7a270ea257d Apr 22 17:53:26.260685 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.260614 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod242c5d0e_f778_473e_a7b6_3a94132fea7c.slice/crio-616c1c4a64501b931b325e69f49aa8a81b3fc3cfdcfc307f4962c50fdbaae9de WatchSource:0}: Error finding container 616c1c4a64501b931b325e69f49aa8a81b3fc3cfdcfc307f4962c50fdbaae9de: Status 404 returned error can't find the container with id 616c1c4a64501b931b325e69f49aa8a81b3fc3cfdcfc307f4962c50fdbaae9de Apr 22 17:53:26.262256 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.262232 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f7140d2_3c3c_477e_ab7b_229503f3cbd9.slice/crio-6c2ce51806c092bf9ac3409c12571743b9f37f5afc603a4c8731ea44387c4d27 WatchSource:0}: Error finding container 6c2ce51806c092bf9ac3409c12571743b9f37f5afc603a4c8731ea44387c4d27: Status 404 returned error can't find the container with id 6c2ce51806c092bf9ac3409c12571743b9f37f5afc603a4c8731ea44387c4d27 Apr 22 17:53:26.340539 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.340506 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" event={"ID":"242c5d0e-f778-473e-a7b6-3a94132fea7c","Type":"ContainerStarted","Data":"616c1c4a64501b931b325e69f49aa8a81b3fc3cfdcfc307f4962c50fdbaae9de"} Apr 22 17:53:26.343572 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.342340 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv" event={"ID":"418e6314-c842-4a4a-82f4-6daab5c36653","Type":"ContainerStarted","Data":"208963e3de76a7e34ba3ff9a8c5c20ae91f9e3ee7e847d66b51ef7a270ea257d"}
Apr 22 17:53:26.346075 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.346043 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" event={"ID":"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6","Type":"ContainerStarted","Data":"46ec13942201b00e0a575af2e7a1c01e462af893fa538dcf5fd658a5632bef4a"}
Apr 22 17:53:26.347180 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.347143 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" event={"ID":"0972f1d3-8168-44be-896c-c3d80cd4c9d7","Type":"ContainerStarted","Data":"a44e745bdaa846cd58ffd75faf5add3d51e0c1a4ad8801d5035a4c223603eb29"}
Apr 22 17:53:26.348050 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.348028 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" event={"ID":"99308cc1-5395-417c-bf2d-54fe0c5411d7","Type":"ContainerStarted","Data":"234f2bdc420eb57612f64c844cb802a9cff3aaa2463f1959ffe0aeb811fdff66"}
Apr 22 17:53:26.349270 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.349246 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx" event={"ID":"0b69db01-4663-4db0-84fe-b0eaeccdfb5a","Type":"ContainerStarted","Data":"1d623f7781adfd5a41c76fc06a04a851119eef804b73adc07d7234688116db7a"}
Apr 22 17:53:26.350350 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.350326 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" event={"ID":"7f7140d2-3c3c-477e-ab7b-229503f3cbd9","Type":"ContainerStarted","Data":"6c2ce51806c092bf9ac3409c12571743b9f37f5afc603a4c8731ea44387c4d27"}
Apr 22 17:53:26.351372 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.351353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" event={"ID":"dc88d73f-15f0-4054-82fd-935550f076e2","Type":"ContainerStarted","Data":"1e7e7c50b442a8759c561bfc6204e82656c357bee86d0a4f855f51a445df6f04"}
Apr 22 17:53:26.352349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.352330 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vkb44" event={"ID":"75454aa1-9f9c-481a-b5e0-248d97ce5213","Type":"ContainerStarted","Data":"4618bfb750780d61df5a889ca3fd40de9d5bda3b3a0a0166aa329d6c14475f1c"}
Apr 22 17:53:26.414120 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.414088 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4phwt"]
Apr 22 17:53:26.417255 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:26.417223 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd950d834_86a0_437a_b1c6_30e88678d30b.slice/crio-9cea70123888e8a943a12c881e62fed160852dbfc3db1441f692e3c33216e593 WatchSource:0}: Error finding container 9cea70123888e8a943a12c881e62fed160852dbfc3db1441f692e3c33216e593: Status 404 returned error can't find the container with id 9cea70123888e8a943a12c881e62fed160852dbfc3db1441f692e3c33216e593
Apr 22 17:53:26.795502 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.795470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:26.795662 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.795621 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:26.795662 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.795638 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8599fb6f6c-qtwn7: secret "image-registry-tls" not found
Apr 22 17:53:26.795799 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.795704 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls podName:75f87f2c-183f-4d31-91cd-2752918acc59 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:28.79568424 +0000 UTC m=+37.104315709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls") pod "image-registry-8599fb6f6c-qtwn7" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59") : secret "image-registry-tls" not found
Apr 22 17:53:26.896624 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.895939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:26.896624 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.896014 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:26.896624 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.896109 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6"
Apr 22 17:53:26.896624 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.896299 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:26.896624 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.896361 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert podName:fa19e254-4e3d-4822-81d3-7ea095625185 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:28.896343165 +0000 UTC m=+37.204974629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert") pod "ingress-canary-rnpt6" (UID: "fa19e254-4e3d-4822-81d3-7ea095625185") : secret "canary-serving-cert" not found
Apr 22 17:53:26.896624 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.896415 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:28.896403666 +0000 UTC m=+37.205035137 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : configmap references non-existent config key: service-ca.crt
Apr 22 17:53:26.896624 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.896490 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:53:26.896624 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.896525 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:28.896512817 +0000 UTC m=+37.205144282 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : secret "router-metrics-certs-default" not found
Apr 22 17:53:26.997650 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.997612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"
Apr 22 17:53:26.998192 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:26.997671 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"
Apr 22 17:53:26.998192 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.998056 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 17:53:26.998192 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.998131 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls podName:3267974c-a8ce-4fa1-98cf-6213634080a0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:28.998111206 +0000 UTC m=+37.306742676 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lv8r8" (UID: "3267974c-a8ce-4fa1-98cf-6213634080a0") : secret "samples-operator-tls" not found
Apr 22 17:53:26.998626 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.998542 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:53:26.998626 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:26.998595 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls podName:28c65550-3cca-4589-82a4-baaf985beda6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:28.998579622 +0000 UTC m=+37.307211089 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pmdgh" (UID: "28c65550-3cca-4589-82a4-baaf985beda6") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:53:27.099136 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:27.099100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw"
Apr 22 17:53:27.099410 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:27.099393 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:27.099484 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:27.099458 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls podName:8f0708c3-8b05-45e1-9d30-ca3772151671 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:29.0994416 +0000 UTC m=+37.408073071 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls") pod "dns-default-zs6sw" (UID: "8f0708c3-8b05-45e1-9d30-ca3772151671") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:27.359368 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:27.359282 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4phwt" event={"ID":"d950d834-86a0-437a-b1c6-30e88678d30b","Type":"ContainerStarted","Data":"9cea70123888e8a943a12c881e62fed160852dbfc3db1441f692e3c33216e593"}
Apr 22 17:53:27.372047 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:27.371991 2568 generic.go:358] "Generic (PLEG): container finished" podID="e29ab8a7-8881-4951-93eb-55d0b996dbcb" containerID="4ca330125cf3ee754b532777cfe6d5e251e2e1ccb23cfad31ec8af2b7ba11a11" exitCode=0
Apr 22 17:53:27.372207 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:27.372052 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerDied","Data":"4ca330125cf3ee754b532777cfe6d5e251e2e1ccb23cfad31ec8af2b7ba11a11"}
Apr 22 17:53:28.394953 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:28.394396 2568 generic.go:358] "Generic (PLEG): container finished" podID="e29ab8a7-8881-4951-93eb-55d0b996dbcb" containerID="bbca1ef81e72ce1b15744323cad6a3f2c1bb86cf7516222df6f5c1469afe40a8" exitCode=0
Apr 22 17:53:28.396316 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:28.396266 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerDied","Data":"bbca1ef81e72ce1b15744323cad6a3f2c1bb86cf7516222df6f5c1469afe40a8"}
Apr 22 17:53:28.822933 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:28.822145 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:28.822933 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:28.822324 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:28.822933 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:28.822339 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8599fb6f6c-qtwn7: secret "image-registry-tls" not found
Apr 22 17:53:28.822933 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:28.822407 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls podName:75f87f2c-183f-4d31-91cd-2752918acc59 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.822381055 +0000 UTC m=+41.131012523 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls") pod "image-registry-8599fb6f6c-qtwn7" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59") : secret "image-registry-tls" not found
Apr 22 17:53:28.924925 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:28.924119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:28.924925 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:28.924180 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:28.924925 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:28.924275 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6"
Apr 22 17:53:28.924925 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:28.924445 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:28.924925 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:28.924508 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert podName:fa19e254-4e3d-4822-81d3-7ea095625185 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.924490305 +0000 UTC m=+41.233121777 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert") pod "ingress-canary-rnpt6" (UID: "fa19e254-4e3d-4822-81d3-7ea095625185") : secret "canary-serving-cert" not found
Apr 22 17:53:28.924925 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:28.924681 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:53:28.924925 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:28.924759 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.924741427 +0000 UTC m=+41.233372908 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : secret "router-metrics-certs-default" not found
Apr 22 17:53:28.924925 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:28.924830 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.92482015 +0000 UTC m=+41.233451623 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : configmap references non-existent config key: service-ca.crt
Apr 22 17:53:29.025062 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:29.025022 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"
Apr 22 17:53:29.025234 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:29.025084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"
Apr 22 17:53:29.025753 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:29.025346 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 17:53:29.025753 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:29.025556 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:53:29.025753 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:29.025618 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls podName:28c65550-3cca-4589-82a4-baaf985beda6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:33.025598714 +0000 UTC m=+41.334230196 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pmdgh" (UID: "28c65550-3cca-4589-82a4-baaf985beda6") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:53:29.026405 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:29.026355 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls podName:3267974c-a8ce-4fa1-98cf-6213634080a0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:33.026337157 +0000 UTC m=+41.334968628 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lv8r8" (UID: "3267974c-a8ce-4fa1-98cf-6213634080a0") : secret "samples-operator-tls" not found
Apr 22 17:53:29.126850 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:29.126535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw"
Apr 22 17:53:29.126850 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:29.126769 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:29.126850 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:29.126840 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls podName:8f0708c3-8b05-45e1-9d30-ca3772151671 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:33.126817053 +0000 UTC m=+41.435448522 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls") pod "dns-default-zs6sw" (UID: "8f0708c3-8b05-45e1-9d30-ca3772151671") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:32.654086 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:32.654051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:32.657849 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:32.657831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0289f618-f4aa-4688-a261-c755d1a71444-original-pull-secret\") pod \"global-pull-secret-syncer-ldvlp\" (UID: \"0289f618-f4aa-4688-a261-c755d1a71444\") " pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:32.753976 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:32.753952 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ldvlp"
Apr 22 17:53:32.854978 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:32.854953 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:32.855105 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:32.855061 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 17:53:32.855105 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:32.855074 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8599fb6f6c-qtwn7: secret "image-registry-tls" not found
Apr 22 17:53:32.855173 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:32.855124 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls podName:75f87f2c-183f-4d31-91cd-2752918acc59 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.855107487 +0000 UTC m=+49.163738954 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls") pod "image-registry-8599fb6f6c-qtwn7" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59") : secret "image-registry-tls" not found
Apr 22 17:53:32.956009 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:32.955973 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6"
Apr 22 17:53:32.956176 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:32.956139 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:53:32.956239 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:32.956197 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:32.956239 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:32.956206 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert podName:fa19e254-4e3d-4822-81d3-7ea095625185 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.956185562 +0000 UTC m=+49.264817026 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert") pod "ingress-canary-rnpt6" (UID: "fa19e254-4e3d-4822-81d3-7ea095625185") : secret "canary-serving-cert" not found
Apr 22 17:53:32.956351 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:32.956249 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:32.956351 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:32.956347 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 17:53:32.956456 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:32.956368 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.956355058 +0000 UTC m=+49.264986548 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : configmap references non-existent config key: service-ca.crt
Apr 22 17:53:32.956456 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:32.956392 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:40.956379058 +0000 UTC m=+49.265010522 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : secret "router-metrics-certs-default" not found
Apr 22 17:53:33.057199 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:33.057163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"
Apr 22 17:53:33.057415 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:33.057210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"
Apr 22 17:53:33.057415 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:33.057325 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:53:33.057415 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:33.057395 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls podName:28c65550-3cca-4589-82a4-baaf985beda6 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.057376277 +0000 UTC m=+49.366007752 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pmdgh" (UID: "28c65550-3cca-4589-82a4-baaf985beda6") : secret "cluster-monitoring-operator-tls" not found
Apr 22 17:53:33.057415 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:33.057398 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 17:53:33.057636 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:33.057451 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls podName:3267974c-a8ce-4fa1-98cf-6213634080a0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.057439251 +0000 UTC m=+49.366070715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lv8r8" (UID: "3267974c-a8ce-4fa1-98cf-6213634080a0") : secret "samples-operator-tls" not found
Apr 22 17:53:33.157976 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:33.157934 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw"
Apr 22 17:53:33.158160 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:33.158089 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:53:33.158160 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:33.158157 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls podName:8f0708c3-8b05-45e1-9d30-ca3772151671 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:41.158141368 +0000 UTC m=+49.466772833 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls") pod "dns-default-zs6sw" (UID: "8f0708c3-8b05-45e1-9d30-ca3772151671") : secret "dns-default-metrics-tls" not found
Apr 22 17:53:37.429244 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:37.429150 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" event={"ID":"e29ab8a7-8881-4951-93eb-55d0b996dbcb","Type":"ContainerStarted","Data":"1e6ea831bd03bdddccb999106490f420086bb8cc002d07f5598922a9c8025b12"}
Apr 22 17:53:37.453151 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:37.453089 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s7s7v" podStartSLOduration=14.089956003 podStartE2EDuration="45.453072346s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:52:54.949928061 +0000 UTC m=+3.258559527" lastFinishedPulling="2026-04-22 17:53:26.313044394 +0000 UTC m=+34.621675870" observedRunningTime="2026-04-22 17:53:37.451003174 +0000 UTC m=+45.759634661" watchObservedRunningTime="2026-04-22 17:53:37.453072346 +0000 UTC m=+45.761703835"
Apr 22 17:53:38.611450 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:38.611427 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ldvlp"]
Apr 22 17:53:38.616779 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:38.616746 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0289f618_f4aa_4688_a261_c755d1a71444.slice/crio-16aa09760f9831a737692903764f52dcc1d056658d703d6a34b33fce810863e8 WatchSource:0}: Error finding container 16aa09760f9831a737692903764f52dcc1d056658d703d6a34b33fce810863e8: Status 404 returned error can't find the container with id 16aa09760f9831a737692903764f52dcc1d056658d703d6a34b33fce810863e8
Apr 22 17:53:39.437434 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.437197 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" event={"ID":"7f7140d2-3c3c-477e-ab7b-229503f3cbd9","Type":"ContainerStarted","Data":"fea45b1ce531c800ae31bf197a6e1f92ecb25abe07a0faa954f0baefc7e4271a"}
Apr 22 17:53:39.437854 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.437759 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx"
Apr 22 17:53:39.439370 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.439337 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx"
Apr 22 17:53:39.440638 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.440610 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" event={"ID":"dc88d73f-15f0-4054-82fd-935550f076e2","Type":"ContainerStarted","Data":"fba14bb3ed3fce61d9802f7e8a61e0a5adead6e46c508b4156cd819395213af3"}
Apr 22 17:53:39.442819 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.442796 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vkb44" event={"ID":"75454aa1-9f9c-481a-b5e0-248d97ce5213","Type":"ContainerStarted","Data":"278c13a16c0448b0f3eb2e576339affd86dfa62b95e347188aa2d529d74ab6ca"}
Apr 22 17:53:39.443979 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.443947 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ldvlp" event={"ID":"0289f618-f4aa-4688-a261-c755d1a71444","Type":"ContainerStarted","Data":"16aa09760f9831a737692903764f52dcc1d056658d703d6a34b33fce810863e8"} Apr 22 17:53:39.445219 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.445195 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" event={"ID":"242c5d0e-f778-473e-a7b6-3a94132fea7c","Type":"ContainerStarted","Data":"6e1ed20e77a48b13b446bd441bc034bcee88b2dfd4e6ce70d356157aac5f13a5"} Apr 22 17:53:39.446953 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.446921 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv" event={"ID":"418e6314-c842-4a4a-82f4-6daab5c36653","Type":"ContainerStarted","Data":"296c3c9984f3822fe5a9cdd16dad127d3eb1a104a9cd0c0a51abea8a633d2fed"} Apr 22 17:53:39.448264 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.448235 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" event={"ID":"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6","Type":"ContainerStarted","Data":"8c403f70160dbb600891312adf40f39ab28df7b92f63f335b79dbad8930d571a"} Apr 22 17:53:39.449693 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.449675 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/0.log" Apr 22 17:53:39.449811 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.449712 2568 generic.go:358] "Generic (PLEG): container finished" podID="0972f1d3-8168-44be-896c-c3d80cd4c9d7" containerID="041fead81ba431f2cf88d8c113bdd94944787e6627e19234c4c904769bdbe388" exitCode=255 Apr 22 
17:53:39.449811 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.449756 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" event={"ID":"0972f1d3-8168-44be-896c-c3d80cd4c9d7","Type":"ContainerDied","Data":"041fead81ba431f2cf88d8c113bdd94944787e6627e19234c4c904769bdbe388"} Apr 22 17:53:39.450009 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.449992 2568 scope.go:117] "RemoveContainer" containerID="041fead81ba431f2cf88d8c113bdd94944787e6627e19234c4c904769bdbe388" Apr 22 17:53:39.451573 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.451540 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" event={"ID":"99308cc1-5395-417c-bf2d-54fe0c5411d7","Type":"ContainerStarted","Data":"37b807ec5f69f82d60408ee38e836c237eb66fc2f9352dc3b24847531588442a"} Apr 22 17:53:39.453187 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.453164 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx" event={"ID":"0b69db01-4663-4db0-84fe-b0eaeccdfb5a","Type":"ContainerStarted","Data":"9edae1d6e0c3898b178863915b491b9e5f5f9318c97e742a8dcf20974ec3e459"} Apr 22 17:53:39.457010 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.456978 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4phwt" event={"ID":"d950d834-86a0-437a-b1c6-30e88678d30b","Type":"ContainerStarted","Data":"3595873de9b90eaa3c991593219f77a7fb35c3cca814bfd90d0c5eb588ee19cc"} Apr 22 17:53:39.457308 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.457290 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:53:39.471815 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.471763 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697d7b9785-dwksx" podStartSLOduration=13.211622839 podStartE2EDuration="25.471720373s" podCreationTimestamp="2026-04-22 17:53:14 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.287205014 +0000 UTC m=+34.595836482" lastFinishedPulling="2026-04-22 17:53:38.547302548 +0000 UTC m=+46.855934016" observedRunningTime="2026-04-22 17:53:39.454881603 +0000 UTC m=+47.763513091" watchObservedRunningTime="2026-04-22 17:53:39.471720373 +0000 UTC m=+47.780351865" Apr 22 17:53:39.492412 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.492280 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86cdd58467-cj5lm" podStartSLOduration=13.203307843 podStartE2EDuration="25.492258818s" podCreationTimestamp="2026-04-22 17:53:14 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.25928634 +0000 UTC m=+34.567917806" lastFinishedPulling="2026-04-22 17:53:38.548237309 +0000 UTC m=+46.856868781" observedRunningTime="2026-04-22 17:53:39.472993801 +0000 UTC m=+47.781625289" watchObservedRunningTime="2026-04-22 17:53:39.492258818 +0000 UTC m=+47.800890305" Apr 22 17:53:39.492650 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.492608 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" podStartSLOduration=7.800453975 podStartE2EDuration="18.492601129s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.28739503 +0000 UTC m=+34.596026495" lastFinishedPulling="2026-04-22 17:53:36.979542169 +0000 UTC m=+45.288173649" observedRunningTime="2026-04-22 17:53:39.490806189 +0000 UTC m=+47.799437674" watchObservedRunningTime="2026-04-22 17:53:39.492601129 +0000 UTC m=+47.801232616" Apr 22 17:53:39.535129 ip-10-0-142-118 
kubenswrapper[2568]: I0422 17:53:39.534684 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" podStartSLOduration=8.301561287 podStartE2EDuration="18.534666877s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.25528471 +0000 UTC m=+34.563916176" lastFinishedPulling="2026-04-22 17:53:36.488390286 +0000 UTC m=+44.797021766" observedRunningTime="2026-04-22 17:53:39.509008322 +0000 UTC m=+47.817639814" watchObservedRunningTime="2026-04-22 17:53:39.534666877 +0000 UTC m=+47.843298366" Apr 22 17:53:39.579120 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.579059 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-vkb44" podStartSLOduration=6.28772224 podStartE2EDuration="18.579034617s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.256359034 +0000 UTC m=+34.564990513" lastFinishedPulling="2026-04-22 17:53:38.547671419 +0000 UTC m=+46.856302890" observedRunningTime="2026-04-22 17:53:39.555601663 +0000 UTC m=+47.864233151" watchObservedRunningTime="2026-04-22 17:53:39.579034617 +0000 UTC m=+47.887666105" Apr 22 17:53:39.579493 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.579459 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9wdzx" podStartSLOduration=6.341117368 podStartE2EDuration="18.579450804s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.254746651 +0000 UTC m=+34.563378130" lastFinishedPulling="2026-04-22 17:53:38.49308009 +0000 UTC m=+46.801711566" observedRunningTime="2026-04-22 17:53:39.578174841 +0000 UTC m=+47.886806326" watchObservedRunningTime="2026-04-22 17:53:39.579450804 +0000 UTC m=+47.888082292" Apr 22 
17:53:39.622631 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.622575 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vqhdv" podStartSLOduration=6.98199606 podStartE2EDuration="18.622556317s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.287291518 +0000 UTC m=+34.595922997" lastFinishedPulling="2026-04-22 17:53:37.927851771 +0000 UTC m=+46.236483254" observedRunningTime="2026-04-22 17:53:39.621415506 +0000 UTC m=+47.930046995" watchObservedRunningTime="2026-04-22 17:53:39.622556317 +0000 UTC m=+47.931187801" Apr 22 17:53:39.642107 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:39.641900 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4phwt" podStartSLOduration=35.514611938 podStartE2EDuration="47.641879931s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.419359158 +0000 UTC m=+34.727990627" lastFinishedPulling="2026-04-22 17:53:38.546627154 +0000 UTC m=+46.855258620" observedRunningTime="2026-04-22 17:53:39.640120633 +0000 UTC m=+47.948752118" watchObservedRunningTime="2026-04-22 17:53:39.641879931 +0000 UTC m=+47.950511419" Apr 22 17:53:40.462210 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:40.462179 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log" Apr 22 17:53:40.462629 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:40.462612 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/0.log" Apr 22 17:53:40.462712 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:40.462657 2568 generic.go:358] "Generic (PLEG): container 
finished" podID="0972f1d3-8168-44be-896c-c3d80cd4c9d7" containerID="16473114c688483487274076bdc2e61f1c02272ef01387703b4c432868a310b1" exitCode=255 Apr 22 17:53:40.462887 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:40.462803 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" event={"ID":"0972f1d3-8168-44be-896c-c3d80cd4c9d7","Type":"ContainerDied","Data":"16473114c688483487274076bdc2e61f1c02272ef01387703b4c432868a310b1"} Apr 22 17:53:40.462887 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:40.462860 2568 scope.go:117] "RemoveContainer" containerID="041fead81ba431f2cf88d8c113bdd94944787e6627e19234c4c904769bdbe388" Apr 22 17:53:40.463054 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:40.463042 2568 scope.go:117] "RemoveContainer" containerID="16473114c688483487274076bdc2e61f1c02272ef01387703b4c432868a310b1" Apr 22 17:53:40.463488 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:40.463261 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7v4cv_openshift-console-operator(0972f1d3-8168-44be-896c-c3d80cd4c9d7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" podUID="0972f1d3-8168-44be-896c-c3d80cd4c9d7" Apr 22 17:53:40.931943 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:40.931855 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:53:40.932324 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:40.932012 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 22 17:53:40.932324 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:40.932036 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8599fb6f6c-qtwn7: secret "image-registry-tls" not found Apr 22 17:53:40.932324 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:40.932098 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls podName:75f87f2c-183f-4d31-91cd-2752918acc59 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:56.932082243 +0000 UTC m=+65.240713707 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls") pod "image-registry-8599fb6f6c-qtwn7" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59") : secret "image-registry-tls" not found Apr 22 17:53:41.032505 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.032465 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6" Apr 22 17:53:41.032690 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.032587 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:53:41.032690 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.032593 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 
17:53:41.032690 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.032664 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert podName:fa19e254-4e3d-4822-81d3-7ea095625185 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:57.032645173 +0000 UTC m=+65.341276639 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert") pod "ingress-canary-rnpt6" (UID: "fa19e254-4e3d-4822-81d3-7ea095625185") : secret "canary-serving-cert" not found Apr 22 17:53:41.032915 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.032701 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:57.032684851 +0000 UTC m=+65.341316331 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : configmap references non-existent config key: service-ca.crt Apr 22 17:53:41.032915 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.032753 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q" Apr 22 17:53:41.032915 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.032849 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:53:41.032915 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.032891 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs podName:224a42db-ff4d-4e18-a064-b7f2a7b10e91 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:57.032879129 +0000 UTC m=+65.341510597 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs") pod "router-default-b747876cb-7f77q" (UID: "224a42db-ff4d-4e18-a064-b7f2a7b10e91") : secret "router-metrics-certs-default" not found Apr 22 17:53:41.133859 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.133822 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" Apr 22 17:53:41.134042 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.133877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" Apr 22 17:53:41.134042 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.133980 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:53:41.134155 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.134037 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:53:41.134155 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.134041 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls podName:28c65550-3cca-4589-82a4-baaf985beda6 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:53:57.134022076 +0000 UTC m=+65.442653542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pmdgh" (UID: "28c65550-3cca-4589-82a4-baaf985beda6") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:53:41.134155 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.134139 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls podName:3267974c-a8ce-4fa1-98cf-6213634080a0 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:57.134127485 +0000 UTC m=+65.442758953 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-lv8r8" (UID: "3267974c-a8ce-4fa1-98cf-6213634080a0") : secret "samples-operator-tls" not found Apr 22 17:53:41.234654 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.234622 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw" Apr 22 17:53:41.234846 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.234780 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:53:41.234899 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.234848 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls podName:8f0708c3-8b05-45e1-9d30-ca3772151671 nodeName:}" 
failed. No retries permitted until 2026-04-22 17:53:57.234831406 +0000 UTC m=+65.543462872 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls") pod "dns-default-zs6sw" (UID: "8f0708c3-8b05-45e1-9d30-ca3772151671") : secret "dns-default-metrics-tls" not found Apr 22 17:53:41.359504 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.359472 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-77tjc"] Apr 22 17:53:41.380871 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.380840 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-77tjc"] Apr 22 17:53:41.381057 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.380985 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.383799 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.383776 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:53:41.383931 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.383786 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:53:41.383931 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.383831 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dqcf6\"" Apr 22 17:53:41.437264 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.437227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-data-volume\") pod \"insights-runtime-extractor-77tjc\" (UID: 
\"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.437428 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.437271 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.437428 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.437312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-crio-socket\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.437503 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.437455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljq72\" (UniqueName: \"kubernetes.io/projected/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-kube-api-access-ljq72\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.437503 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.437497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.467262 ip-10-0-142-118 kubenswrapper[2568]: I0422 
17:53:41.467237 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log" Apr 22 17:53:41.467690 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.467675 2568 scope.go:117] "RemoveContainer" containerID="16473114c688483487274076bdc2e61f1c02272ef01387703b4c432868a310b1" Apr 22 17:53:41.467860 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.467843 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7v4cv_openshift-console-operator(0972f1d3-8168-44be-896c-c3d80cd4c9d7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" podUID="0972f1d3-8168-44be-896c-c3d80cd4c9d7" Apr 22 17:53:41.538841 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.538762 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljq72\" (UniqueName: \"kubernetes.io/projected/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-kube-api-access-ljq72\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.538841 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.538811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.539059 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.539012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-data-volume\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.539112 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.539058 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.539180 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.539117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-crio-socket\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc" Apr 22 17:53:41.539297 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.539201 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 17:53:41.539297 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:41.539260 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls podName:d3db4b4c-de2a-4504-ac79-1a6d59c9892e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:42.039242617 +0000 UTC m=+50.347874095 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-77tjc" (UID: "d3db4b4c-de2a-4504-ac79-1a6d59c9892e") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:41.539406 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.539321 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-data-volume\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:41.539406 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.539333 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-crio-socket\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:41.539510 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.539482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:41.551201 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.551181 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljq72\" (UniqueName: \"kubernetes.io/projected/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-kube-api-access-ljq72\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:41.615689 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:41.615668 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sbk9w_3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca/dns-node-resolver/0.log"
Apr 22 17:53:42.042672 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:42.042631 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:42.043110 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:42.042796 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:42.043110 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:42.042884 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls podName:d3db4b4c-de2a-4504-ac79-1a6d59c9892e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:43.042863446 +0000 UTC m=+51.351494911 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-77tjc" (UID: "d3db4b4c-de2a-4504-ac79-1a6d59c9892e") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:42.417397 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:42.417320 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tjtfp_8c0ae7fd-c205-4928-b51f-9f80202d3f77/node-ca/0.log"
Apr 22 17:53:43.054251 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:43.054204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:43.054763 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:43.054368 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:43.054763 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:43.054453 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls podName:d3db4b4c-de2a-4504-ac79-1a6d59c9892e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:45.054436798 +0000 UTC m=+53.363068263 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-77tjc" (UID: "d3db4b4c-de2a-4504-ac79-1a6d59c9892e") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:44.478276 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:44.478242 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ldvlp" event={"ID":"0289f618-f4aa-4688-a261-c755d1a71444","Type":"ContainerStarted","Data":"56a1c099bbe3aecb1191f2ee4dbd730674cc71bcdf0681b80b5e4d9aa92e30ce"}
Apr 22 17:53:44.497705 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:44.497657 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ldvlp" podStartSLOduration=39.638719356 podStartE2EDuration="44.497642737s" podCreationTimestamp="2026-04-22 17:53:00 +0000 UTC" firstStartedPulling="2026-04-22 17:53:38.618587848 +0000 UTC m=+46.927219329" lastFinishedPulling="2026-04-22 17:53:43.477511241 +0000 UTC m=+51.786142710" observedRunningTime="2026-04-22 17:53:44.496417985 +0000 UTC m=+52.805049509" watchObservedRunningTime="2026-04-22 17:53:44.497642737 +0000 UTC m=+52.806274223"
Apr 22 17:53:45.071186 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:45.071147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:45.071391 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:45.071296 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:45.071391 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:45.071380 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls podName:d3db4b4c-de2a-4504-ac79-1a6d59c9892e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:49.071357395 +0000 UTC m=+57.379988863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-77tjc" (UID: "d3db4b4c-de2a-4504-ac79-1a6d59c9892e") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:45.415329 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:45.415302 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22 17:53:45.415485 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:45.415337 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22 17:53:45.415685 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:45.415673 2568 scope.go:117] "RemoveContainer" containerID="16473114c688483487274076bdc2e61f1c02272ef01387703b4c432868a310b1"
Apr 22 17:53:45.415889 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:45.415873 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-7v4cv_openshift-console-operator(0972f1d3-8168-44be-896c-c3d80cd4c9d7)\"" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" podUID="0972f1d3-8168-44be-896c-c3d80cd4c9d7"
Apr 22 17:53:45.482614 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:45.482585 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" event={"ID":"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6","Type":"ContainerStarted","Data":"dd57489737c0cdb864d83456e0610354c8339ac6c51a46ebd48ac34b38b28c43"}
Apr 22 17:53:45.483022 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:45.482622 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" event={"ID":"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6","Type":"ContainerStarted","Data":"c8c1e12cc9bdd262dd4a8da87b078bdb665bb3574dae072a7a802bedb12b91c0"}
Apr 22 17:53:45.501255 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:45.501211 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" podStartSLOduration=12.498809806 podStartE2EDuration="31.501198221s" podCreationTimestamp="2026-04-22 17:53:14 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.258095284 +0000 UTC m=+34.566726752" lastFinishedPulling="2026-04-22 17:53:45.2604837 +0000 UTC m=+53.569115167" observedRunningTime="2026-04-22 17:53:45.500721446 +0000 UTC m=+53.809352934" watchObservedRunningTime="2026-04-22 17:53:45.501198221 +0000 UTC m=+53.809829708"
Apr 22 17:53:49.109200 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:49.109164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:49.109566 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:49.109322 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:49.109566 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:53:49.109390 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls podName:d3db4b4c-de2a-4504-ac79-1a6d59c9892e nodeName:}" failed. No retries permitted until 2026-04-22 17:53:57.109374145 +0000 UTC m=+65.418005610 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-77tjc" (UID: "d3db4b4c-de2a-4504-ac79-1a6d59c9892e") : secret "insights-runtime-extractor-tls" not found
Apr 22 17:53:51.338942 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:51.338913 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fvkcv"
Apr 22 17:53:56.975656 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:56.975615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:56.978016 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:56.977991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"image-registry-8599fb6f6c-qtwn7\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:57.076198 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.076163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:57.076198 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.076207 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:57.076453 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.076247 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6"
Apr 22 17:53:57.076876 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.076851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/224a42db-ff4d-4e18-a064-b7f2a7b10e91-service-ca-bundle\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:57.078652 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.078634 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/224a42db-ff4d-4e18-a064-b7f2a7b10e91-metrics-certs\") pod \"router-default-b747876cb-7f77q\" (UID: \"224a42db-ff4d-4e18-a064-b7f2a7b10e91\") " pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:57.078710 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.078694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fa19e254-4e3d-4822-81d3-7ea095625185-cert\") pod \"ingress-canary-rnpt6\" (UID: \"fa19e254-4e3d-4822-81d3-7ea095625185\") " pod="openshift-ingress-canary/ingress-canary-rnpt6"
Apr 22 17:53:57.177291 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.177255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:57.177454 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.177369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"
Apr 22 17:53:57.177454 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.177402 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"
Apr 22 17:53:57.179770 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.179717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d3db4b4c-de2a-4504-ac79-1a6d59c9892e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-77tjc\" (UID: \"d3db4b4c-de2a-4504-ac79-1a6d59c9892e\") " pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:57.179893 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.179814 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/28c65550-3cca-4589-82a4-baaf985beda6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pmdgh\" (UID: \"28c65550-3cca-4589-82a4-baaf985beda6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"
Apr 22 17:53:57.179893 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.179838 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3267974c-a8ce-4fa1-98cf-6213634080a0-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-lv8r8\" (UID: \"3267974c-a8ce-4fa1-98cf-6213634080a0\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"
Apr 22 17:53:57.209584 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.209558 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-tlhcz\""
Apr 22 17:53:57.217774 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.217758 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:57.223113 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.223083 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-gh445\""
Apr 22 17:53:57.231055 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.231004 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:57.261400 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.261370 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dchxj\""
Apr 22 17:53:57.269851 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.269821 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rnpt6"
Apr 22 17:53:57.278615 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.278133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw"
Apr 22 17:53:57.283162 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.281443 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8f0708c3-8b05-45e1-9d30-ca3772151671-metrics-tls\") pod \"dns-default-zs6sw\" (UID: \"8f0708c3-8b05-45e1-9d30-ca3772151671\") " pod="openshift-dns/dns-default-zs6sw"
Apr 22 17:53:57.296390 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.294999 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-dqcf6\""
Apr 22 17:53:57.303147 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.303119 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-77tjc"
Apr 22 17:53:57.343930 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.343821 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-rkw79\""
Apr 22 17:53:57.352335 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.351719 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"
Apr 22 17:53:57.368036 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.368002 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8599fb6f6c-qtwn7"]
Apr 22 17:53:57.370844 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.370811 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-kwq7h\""
Apr 22 17:53:57.377586 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:57.377398 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75f87f2c_183f_4d31_91cd_2752918acc59.slice/crio-4a7fb5240b77076684bfcf9a8b3fdb58f95701cfd718df755acf3f9f3b32c07b WatchSource:0}: Error finding container 4a7fb5240b77076684bfcf9a8b3fdb58f95701cfd718df755acf3f9f3b32c07b: Status 404 returned error can't find the container with id 4a7fb5240b77076684bfcf9a8b3fdb58f95701cfd718df755acf3f9f3b32c07b
Apr 22 17:53:57.378683 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.378623 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"
Apr 22 17:53:57.384934 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.384888 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b747876cb-7f77q"]
Apr 22 17:53:57.391454 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:57.391393 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod224a42db_ff4d_4e18_a064_b7f2a7b10e91.slice/crio-39f4ff5f4207657d23b5d3ee70d3afac8c6ac9c626300c276597726eef08ca87 WatchSource:0}: Error finding container 39f4ff5f4207657d23b5d3ee70d3afac8c6ac9c626300c276597726eef08ca87: Status 404 returned error can't find the container with id 39f4ff5f4207657d23b5d3ee70d3afac8c6ac9c626300c276597726eef08ca87
Apr 22 17:53:57.413495 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.413464 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rnpt6"]
Apr 22 17:53:57.441569 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.441541 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rhqnx\""
Apr 22 17:53:57.450870 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.450619 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zs6sw"
Apr 22 17:53:57.451462 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.451342 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-77tjc"]
Apr 22 17:53:57.511429 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.511405 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh"]
Apr 22 17:53:57.514874 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:57.514846 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c65550_3cca_4589_82a4_baaf985beda6.slice/crio-f1031d7b07b4b549cfb3185cf32fdc83d524552f906aef95a6d0f5ce647d08ad WatchSource:0}: Error finding container f1031d7b07b4b549cfb3185cf32fdc83d524552f906aef95a6d0f5ce647d08ad: Status 404 returned error can't find the container with id f1031d7b07b4b549cfb3185cf32fdc83d524552f906aef95a6d0f5ce647d08ad
Apr 22 17:53:57.515294 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.515265 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-77tjc" event={"ID":"d3db4b4c-de2a-4504-ac79-1a6d59c9892e","Type":"ContainerStarted","Data":"ac90c15a8708d3094f9cc901b60c95bbc5f3a1c10477bc247180b972b504b949"}
Apr 22 17:53:57.517338 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.517131 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" event={"ID":"75f87f2c-183f-4d31-91cd-2752918acc59","Type":"ContainerStarted","Data":"4a7fb5240b77076684bfcf9a8b3fdb58f95701cfd718df755acf3f9f3b32c07b"}
Apr 22 17:53:57.519706 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.519158 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rnpt6" event={"ID":"fa19e254-4e3d-4822-81d3-7ea095625185","Type":"ContainerStarted","Data":"d3b7909c23592a54b552def6191753a950d7064034b04392e863ba9b45352553"}
Apr 22 17:53:57.520506 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.520474 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b747876cb-7f77q" event={"ID":"224a42db-ff4d-4e18-a064-b7f2a7b10e91","Type":"ContainerStarted","Data":"39f4ff5f4207657d23b5d3ee70d3afac8c6ac9c626300c276597726eef08ca87"}
Apr 22 17:53:57.534361 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.534343 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8"]
Apr 22 17:53:57.628398 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.628258 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zs6sw"]
Apr 22 17:53:57.631342 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:57.631298 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f0708c3_8b05_45e1_9d30_ca3772151671.slice/crio-ec324847f35c74024d456fe94aa8ef38c8358be15f3392c2c03e1fd673d28ffd WatchSource:0}: Error finding container ec324847f35c74024d456fe94aa8ef38c8358be15f3392c2c03e1fd673d28ffd: Status 404 returned error can't find the container with id ec324847f35c74024d456fe94aa8ef38c8358be15f3392c2c03e1fd673d28ffd
Apr 22 17:53:57.883633 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.883547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:57.886165 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.886137 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab99124f-2959-4b17-ab76-24041f074fe5-metrics-certs\") pod \"network-metrics-daemon-k7kpf\" (UID: \"ab99124f-2959-4b17-ab76-24041f074fe5\") " pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:57.949388 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.949352 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7sslk\""
Apr 22 17:53:57.957016 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:57.956992 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-k7kpf"
Apr 22 17:53:58.109634 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.109598 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-k7kpf"]
Apr 22 17:53:58.113509 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:53:58.113477 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab99124f_2959_4b17_ab76_24041f074fe5.slice/crio-5dd843f8203b9b9dccebca14a9139afc0273d1189792e25f513841ed4dbc59b8 WatchSource:0}: Error finding container 5dd843f8203b9b9dccebca14a9139afc0273d1189792e25f513841ed4dbc59b8: Status 404 returned error can't find the container with id 5dd843f8203b9b9dccebca14a9139afc0273d1189792e25f513841ed4dbc59b8
Apr 22 17:53:58.529995 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.529925 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" event={"ID":"3267974c-a8ce-4fa1-98cf-6213634080a0","Type":"ContainerStarted","Data":"ead48d13c4ffb2029161ea19cce46be8ccdb7f6275bc0570937ac474b45a1ac1"}
Apr 22 17:53:58.534046 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.533513 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" event={"ID":"75f87f2c-183f-4d31-91cd-2752918acc59","Type":"ContainerStarted","Data":"c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21"}
Apr 22 17:53:58.534046 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.533673 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7"
Apr 22 17:53:58.537450 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.536987 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b747876cb-7f77q" event={"ID":"224a42db-ff4d-4e18-a064-b7f2a7b10e91","Type":"ContainerStarted","Data":"38b4c6277deadaa316a8de16853e92564042c35ca96f56ad31980e3d0eecfe41"}
Apr 22 17:53:58.541016 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.540970 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-77tjc" event={"ID":"d3db4b4c-de2a-4504-ac79-1a6d59c9892e","Type":"ContainerStarted","Data":"9cfe51f0bdf5a02292d4eca55e1af306239751155cc5302346258c47d5a2bccf"}
Apr 22 17:53:58.543570 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.543467 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7kpf" event={"ID":"ab99124f-2959-4b17-ab76-24041f074fe5","Type":"ContainerStarted","Data":"5dd843f8203b9b9dccebca14a9139afc0273d1189792e25f513841ed4dbc59b8"}
Apr 22 17:53:58.545033 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.544978 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zs6sw" event={"ID":"8f0708c3-8b05-45e1-9d30-ca3772151671","Type":"ContainerStarted","Data":"ec324847f35c74024d456fe94aa8ef38c8358be15f3392c2c03e1fd673d28ffd"}
Apr 22 17:53:58.547348 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.547315 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" event={"ID":"28c65550-3cca-4589-82a4-baaf985beda6","Type":"ContainerStarted","Data":"f1031d7b07b4b549cfb3185cf32fdc83d524552f906aef95a6d0f5ce647d08ad"}
Apr 22 17:53:58.554008 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.553897 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" podStartSLOduration=66.553883264 podStartE2EDuration="1m6.553883264s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:58.553836415 +0000 UTC m=+66.862467910" watchObservedRunningTime="2026-04-22 17:53:58.553883264 +0000 UTC m=+66.862514753"
Apr 22 17:53:58.572419 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:58.571910 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-b747876cb-7f77q" podStartSLOduration=37.571895224 podStartE2EDuration="37.571895224s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:58.571761717 +0000 UTC m=+66.880393209" watchObservedRunningTime="2026-04-22 17:53:58.571895224 +0000 UTC m=+66.880526714"
Apr 22 17:53:59.231906 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:59.231859 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:59.234882 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:59.234818 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:59.550083 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:59.550000 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:53:59.551182 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:53:59.551158 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-b747876cb-7f77q"
Apr 22 17:54:00.208355 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:00.208328 2568 scope.go:117] "RemoveContainer" containerID="16473114c688483487274076bdc2e61f1c02272ef01387703b4c432868a310b1"
Apr 22 17:54:02.562180 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.562010 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zs6sw" event={"ID":"8f0708c3-8b05-45e1-9d30-ca3772151671","Type":"ContainerStarted","Data":"be365cda022c7ebdade6403877b32ee54b818e33d97b1c0af0f943f9e70dbdda"}
Apr 22 17:54:02.564882 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.564749 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" event={"ID":"28c65550-3cca-4589-82a4-baaf985beda6","Type":"ContainerStarted","Data":"96bee0c2ac2a725a5255e4abb9321fa45b94093e0aa3e4be7151c63e63d27212"}
Apr 22 17:54:02.572159 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.572099 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" event={"ID":"3267974c-a8ce-4fa1-98cf-6213634080a0","Type":"ContainerStarted","Data":"d90141d920687ec838c60df2d605aeb599bda5eeb80e6feeb544ecd3739111e2"}
Apr 22 17:54:02.572159 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.572130 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" event={"ID":"3267974c-a8ce-4fa1-98cf-6213634080a0","Type":"ContainerStarted","Data":"3030f6ec6b5af909a4e24c0517da084bb72c0e4b3e06f515afadd1234259c7cd"}
Apr 22 17:54:02.576930 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.576884 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log"
Apr 22 17:54:02.577039 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.576972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" event={"ID":"0972f1d3-8168-44be-896c-c3d80cd4c9d7","Type":"ContainerStarted","Data":"c58a0b823a97c63723976d4aa8d3a894d58da800b04fbfd92fa399e1add2f4ef"}
Apr 22 17:54:02.579998 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.579959 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rnpt6" event={"ID":"fa19e254-4e3d-4822-81d3-7ea095625185","Type":"ContainerStarted","Data":"327adcfccde031fed452ef43c27d466cb3feef0b4cc6d4f9c40657a7d9cd25de"}
Apr 22 17:54:02.585136 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.584880 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-77tjc" event={"ID":"d3db4b4c-de2a-4504-ac79-1a6d59c9892e","Type":"ContainerStarted","Data":"90a0143f789f80b6926c7fef532718bc68f7244a2b2f9505429d0175ed645679"}
Apr 22 17:54:02.591753 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.591449 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7kpf" event={"ID":"ab99124f-2959-4b17-ab76-24041f074fe5","Type":"ContainerStarted","Data":"03b99fb83d2f347792a0956d54034945b7b3c7505211ad78ef23e3e8db8512b0"}
Apr 22 17:54:02.593140 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.592068 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pmdgh" podStartSLOduration=36.943852674 podStartE2EDuration="41.592053738s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="2026-04-22 17:53:57.516818834 +0000 UTC m=+65.825450313" lastFinishedPulling="2026-04-22 17:54:02.165019903 +0000 UTC m=+70.473651377" observedRunningTime="2026-04-22 17:54:02.591710605 +0000 UTC m=+70.900342094" watchObservedRunningTime="2026-04-22 17:54:02.592053738 +0000 UTC m=+70.900685225"
Apr 22 17:54:02.593140 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.593107 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22 17:54:02.622278 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.621118 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv" podStartSLOduration=29.386747043 podStartE2EDuration="41.621103603s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="2026-04-22 17:53:26.258690768 +0000 UTC m=+34.567322240" lastFinishedPulling="2026-04-22 17:53:38.493047321 +0000 UTC m=+46.801678800" observedRunningTime="2026-04-22 17:54:02.620292769 +0000 UTC m=+70.928924282" watchObservedRunningTime="2026-04-22 17:54:02.621103603 +0000 UTC m=+70.929735090"
Apr 22 17:54:02.646613 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.646549 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rnpt6" podStartSLOduration=32.905075556 podStartE2EDuration="37.646529408s" podCreationTimestamp="2026-04-22 17:53:25 +0000 UTC" firstStartedPulling="2026-04-22 17:53:57.423538113 +0000 UTC m=+65.732169598" lastFinishedPulling="2026-04-22 17:54:02.164991976 +0000 UTC m=+70.473623450" observedRunningTime="2026-04-22 17:54:02.645311305 +0000 UTC m=+70.953942794" watchObservedRunningTime="2026-04-22 17:54:02.646529408 +0000 UTC m=+70.955160897"
Apr 22 17:54:02.750814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:02.750599 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-7v4cv"
Apr 22
17:54:03.596379 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:03.596342 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zs6sw" event={"ID":"8f0708c3-8b05-45e1-9d30-ca3772151671","Type":"ContainerStarted","Data":"cf57841009f1f6c84ba9897cc43f2ac6492f8300d997a4840242bc53115dce60"} Apr 22 17:54:03.596877 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:03.596415 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zs6sw" Apr 22 17:54:03.598257 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:03.598187 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-k7kpf" event={"ID":"ab99124f-2959-4b17-ab76-24041f074fe5","Type":"ContainerStarted","Data":"3f34c12fcd4d0bdf436248d52024adc833d9699c054878f42476c467a1f8fd58"} Apr 22 17:54:03.631467 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:03.631420 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zs6sw" podStartSLOduration=34.100886404 podStartE2EDuration="38.631402349s" podCreationTimestamp="2026-04-22 17:53:25 +0000 UTC" firstStartedPulling="2026-04-22 17:53:57.634487213 +0000 UTC m=+65.943118693" lastFinishedPulling="2026-04-22 17:54:02.165003159 +0000 UTC m=+70.473634638" observedRunningTime="2026-04-22 17:54:03.629012488 +0000 UTC m=+71.937644000" watchObservedRunningTime="2026-04-22 17:54:03.631402349 +0000 UTC m=+71.940033837" Apr 22 17:54:03.657180 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:03.657120 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-k7kpf" podStartSLOduration=67.606872621 podStartE2EDuration="1m11.657100219s" podCreationTimestamp="2026-04-22 17:52:52 +0000 UTC" firstStartedPulling="2026-04-22 17:53:58.116065173 +0000 UTC m=+66.424696641" lastFinishedPulling="2026-04-22 17:54:02.16629276 +0000 UTC m=+70.474924239" 
observedRunningTime="2026-04-22 17:54:03.655189911 +0000 UTC m=+71.963821399" watchObservedRunningTime="2026-04-22 17:54:03.657100219 +0000 UTC m=+71.965731707" Apr 22 17:54:03.706248 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:03.706187 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-lv8r8" podStartSLOduration=38.157374706 podStartE2EDuration="42.706170643s" podCreationTimestamp="2026-04-22 17:53:21 +0000 UTC" firstStartedPulling="2026-04-22 17:53:57.616198697 +0000 UTC m=+65.924830176" lastFinishedPulling="2026-04-22 17:54:02.164994635 +0000 UTC m=+70.473626113" observedRunningTime="2026-04-22 17:54:03.706048929 +0000 UTC m=+72.014680416" watchObservedRunningTime="2026-04-22 17:54:03.706170643 +0000 UTC m=+72.014802133" Apr 22 17:54:04.603761 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:04.603681 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-77tjc" event={"ID":"d3db4b4c-de2a-4504-ac79-1a6d59c9892e","Type":"ContainerStarted","Data":"14e813c91cbb998fecf0e695ceaa6770fb131e350348308733910f3173434d03"} Apr 22 17:54:04.621741 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:04.621680 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-77tjc" podStartSLOduration=17.054377055 podStartE2EDuration="23.621667096s" podCreationTimestamp="2026-04-22 17:53:41 +0000 UTC" firstStartedPulling="2026-04-22 17:53:57.618159724 +0000 UTC m=+65.926791198" lastFinishedPulling="2026-04-22 17:54:04.185449764 +0000 UTC m=+72.494081239" observedRunningTime="2026-04-22 17:54:04.621175277 +0000 UTC m=+72.929806764" watchObservedRunningTime="2026-04-22 17:54:04.621667096 +0000 UTC m=+72.930298635" Apr 22 17:54:11.470175 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:11.470143 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-4phwt" Apr 22 17:54:12.589703 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.589673 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fvnc2"] Apr 22 17:54:12.594373 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.594350 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.597753 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.597712 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:54:12.597888 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.597767 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:54:12.597888 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.597808 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-zb7n5\"" Apr 22 17:54:12.598828 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.598811 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:54:12.598931 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.598841 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:54:12.706648 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706611 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-tls\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.706648 
ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706649 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4wsc\" (UniqueName: \"kubernetes.io/projected/97f2c808-a28a-451a-ac4a-bd5f265698e3-kube-api-access-c4wsc\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.706903 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706675 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-sys\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.706903 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-accelerators-collector-config\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.706903 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-textfile\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.706903 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706798 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-wtmp\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.706903 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706828 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.706903 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706848 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97f2c808-a28a-451a-ac4a-bd5f265698e3-metrics-client-ca\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.706903 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.706867 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-root\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.807453 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.807417 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.807453 ip-10-0-142-118 kubenswrapper[2568]: 
I0422 17:54:12.807457 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97f2c808-a28a-451a-ac4a-bd5f265698e3-metrics-client-ca\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.807649 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.807578 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-root\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.807693 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.807660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-tls\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.807764 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.807690 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4wsc\" (UniqueName: \"kubernetes.io/projected/97f2c808-a28a-451a-ac4a-bd5f265698e3-kube-api-access-c4wsc\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.807824 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.807790 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-root\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.807896 ip-10-0-142-118 kubenswrapper[2568]: I0422 
17:54:12.807869 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-sys\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.807950 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.807916 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-accelerators-collector-config\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.808009 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.807948 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-textfile\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.808009 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.807954 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-sys\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.808111 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.808061 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-wtmp\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.808252 
ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.808231 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97f2c808-a28a-451a-ac4a-bd5f265698e3-metrics-client-ca\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.808330 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.808258 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-wtmp\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.808330 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.808234 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-textfile\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.808494 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.808465 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-accelerators-collector-config\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.810025 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.810006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fvnc2\" (UID: 
\"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.810148 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.810128 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/97f2c808-a28a-451a-ac4a-bd5f265698e3-node-exporter-tls\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.815217 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.815195 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4wsc\" (UniqueName: \"kubernetes.io/projected/97f2c808-a28a-451a-ac4a-bd5f265698e3-kube-api-access-c4wsc\") pod \"node-exporter-fvnc2\" (UID: \"97f2c808-a28a-451a-ac4a-bd5f265698e3\") " pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.904371 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:12.904317 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-fvnc2" Apr 22 17:54:12.912703 ip-10-0-142-118 kubenswrapper[2568]: W0422 17:54:12.912680 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f2c808_a28a_451a_ac4a_bd5f265698e3.slice/crio-18091542fe09f0ec24b55052504910822406268f46c6a71ac78b697ad1d8e589 WatchSource:0}: Error finding container 18091542fe09f0ec24b55052504910822406268f46c6a71ac78b697ad1d8e589: Status 404 returned error can't find the container with id 18091542fe09f0ec24b55052504910822406268f46c6a71ac78b697ad1d8e589 Apr 22 17:54:13.606613 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:13.606582 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zs6sw" Apr 22 17:54:13.632484 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:13.632456 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fvnc2" event={"ID":"97f2c808-a28a-451a-ac4a-bd5f265698e3","Type":"ContainerStarted","Data":"18091542fe09f0ec24b55052504910822406268f46c6a71ac78b697ad1d8e589"} Apr 22 17:54:14.637461 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:14.637428 2568 generic.go:358] "Generic (PLEG): container finished" podID="97f2c808-a28a-451a-ac4a-bd5f265698e3" containerID="871be28590d3c2eeac8f2d7b984bb920d079eb889c1f165dfd8c2bdc2b4d73bc" exitCode=0 Apr 22 17:54:14.637839 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:14.637513 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fvnc2" event={"ID":"97f2c808-a28a-451a-ac4a-bd5f265698e3","Type":"ContainerDied","Data":"871be28590d3c2eeac8f2d7b984bb920d079eb889c1f165dfd8c2bdc2b4d73bc"} Apr 22 17:54:15.642394 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:15.642357 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fvnc2" 
event={"ID":"97f2c808-a28a-451a-ac4a-bd5f265698e3","Type":"ContainerStarted","Data":"982594ea2949fe86b2660f7c0d5c08b1799f25f115a2e50a3cedd0b600077362"} Apr 22 17:54:15.642394 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:15.642393 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fvnc2" event={"ID":"97f2c808-a28a-451a-ac4a-bd5f265698e3","Type":"ContainerStarted","Data":"2c9e38a3762891130e00606d253fec7166f3473c0d53bca36fca20d93ff82987"} Apr 22 17:54:15.662809 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:15.662758 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fvnc2" podStartSLOduration=2.75619268 podStartE2EDuration="3.662720993s" podCreationTimestamp="2026-04-22 17:54:12 +0000 UTC" firstStartedPulling="2026-04-22 17:54:12.914369947 +0000 UTC m=+81.223001412" lastFinishedPulling="2026-04-22 17:54:13.820898246 +0000 UTC m=+82.129529725" observedRunningTime="2026-04-22 17:54:15.660413061 +0000 UTC m=+83.969044547" watchObservedRunningTime="2026-04-22 17:54:15.662720993 +0000 UTC m=+83.971352480" Apr 22 17:54:17.222667 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:17.222624 2568 patch_prober.go:28] interesting pod/image-registry-8599fb6f6c-qtwn7 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 17:54:17.223067 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:17.222688 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" podUID="75f87f2c-183f-4d31-91cd-2752918acc59" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 17:54:19.554582 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:19.554554 2568 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:54:24.843268 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:24.843239 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8599fb6f6c-qtwn7"] Apr 22 17:54:49.747656 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:49.747624 2568 generic.go:358] "Generic (PLEG): container finished" podID="99308cc1-5395-417c-bf2d-54fe0c5411d7" containerID="37b807ec5f69f82d60408ee38e836c237eb66fc2f9352dc3b24847531588442a" exitCode=0 Apr 22 17:54:49.748027 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:49.747662 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" event={"ID":"99308cc1-5395-417c-bf2d-54fe0c5411d7","Type":"ContainerDied","Data":"37b807ec5f69f82d60408ee38e836c237eb66fc2f9352dc3b24847531588442a"} Apr 22 17:54:49.748027 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:49.747952 2568 scope.go:117] "RemoveContainer" containerID="37b807ec5f69f82d60408ee38e836c237eb66fc2f9352dc3b24847531588442a" Apr 22 17:54:49.866561 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:49.866516 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" podUID="75f87f2c-183f-4d31-91cd-2752918acc59" containerName="registry" containerID="cri-o://c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21" gracePeriod=30 Apr 22 17:54:50.107235 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.107207 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:54:50.195892 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.195858 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") pod \"75f87f2c-183f-4d31-91cd-2752918acc59\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " Apr 22 17:54:50.196065 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.195903 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-registry-certificates\") pod \"75f87f2c-183f-4d31-91cd-2752918acc59\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " Apr 22 17:54:50.196065 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.195930 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-trusted-ca\") pod \"75f87f2c-183f-4d31-91cd-2752918acc59\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " Apr 22 17:54:50.196065 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.195965 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f87f2c-183f-4d31-91cd-2752918acc59-ca-trust-extracted\") pod \"75f87f2c-183f-4d31-91cd-2752918acc59\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " Apr 22 17:54:50.196065 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.196052 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-installation-pull-secrets\") pod \"75f87f2c-183f-4d31-91cd-2752918acc59\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " Apr 22 
17:54:50.196281 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.196092 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-bound-sa-token\") pod \"75f87f2c-183f-4d31-91cd-2752918acc59\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " Apr 22 17:54:50.196281 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.196150 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-image-registry-private-configuration\") pod \"75f87f2c-183f-4d31-91cd-2752918acc59\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " Apr 22 17:54:50.196281 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.196191 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4k64\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-kube-api-access-x4k64\") pod \"75f87f2c-183f-4d31-91cd-2752918acc59\" (UID: \"75f87f2c-183f-4d31-91cd-2752918acc59\") " Apr 22 17:54:50.196430 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.196277 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "75f87f2c-183f-4d31-91cd-2752918acc59" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:54:50.196430 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.196330 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "75f87f2c-183f-4d31-91cd-2752918acc59" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:54:50.196535 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.196449 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-trusted-ca\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 17:54:50.196535 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.196467 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75f87f2c-183f-4d31-91cd-2752918acc59-registry-certificates\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 17:54:50.198923 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.198886 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "75f87f2c-183f-4d31-91cd-2752918acc59" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:54:50.199052 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.198922 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "75f87f2c-183f-4d31-91cd-2752918acc59" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:54:50.199052 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.199031 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "75f87f2c-183f-4d31-91cd-2752918acc59" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:54:50.199165 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.199075 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "75f87f2c-183f-4d31-91cd-2752918acc59" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:54:50.199165 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.199153 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-kube-api-access-x4k64" (OuterVolumeSpecName: "kube-api-access-x4k64") pod "75f87f2c-183f-4d31-91cd-2752918acc59" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59"). InnerVolumeSpecName "kube-api-access-x4k64". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:54:50.203924 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.203903 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75f87f2c-183f-4d31-91cd-2752918acc59-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "75f87f2c-183f-4d31-91cd-2752918acc59" (UID: "75f87f2c-183f-4d31-91cd-2752918acc59"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:54:50.297814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.297750 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75f87f2c-183f-4d31-91cd-2752918acc59-ca-trust-extracted\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 17:54:50.297814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.297777 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-installation-pull-secrets\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 17:54:50.297814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.297788 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-bound-sa-token\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 17:54:50.297814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.297800 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/75f87f2c-183f-4d31-91cd-2752918acc59-image-registry-private-configuration\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 17:54:50.297814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.297809 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4k64\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-kube-api-access-x4k64\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 17:54:50.297814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.297818 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75f87f2c-183f-4d31-91cd-2752918acc59-registry-tls\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath 
\"\"" Apr 22 17:54:50.752400 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.752361 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-kwxxz" event={"ID":"99308cc1-5395-417c-bf2d-54fe0c5411d7","Type":"ContainerStarted","Data":"5ef2da29d1cbe2ba8cfbfa10545f4b1d762deb72f1bdb9b3e324532e735fea44"} Apr 22 17:54:50.753439 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.753414 2568 generic.go:358] "Generic (PLEG): container finished" podID="75f87f2c-183f-4d31-91cd-2752918acc59" containerID="c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21" exitCode=0 Apr 22 17:54:50.753544 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.753469 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" Apr 22 17:54:50.753544 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.753496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" event={"ID":"75f87f2c-183f-4d31-91cd-2752918acc59","Type":"ContainerDied","Data":"c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21"} Apr 22 17:54:50.753544 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.753534 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8599fb6f6c-qtwn7" event={"ID":"75f87f2c-183f-4d31-91cd-2752918acc59","Type":"ContainerDied","Data":"4a7fb5240b77076684bfcf9a8b3fdb58f95701cfd718df755acf3f9f3b32c07b"} Apr 22 17:54:50.753655 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.753552 2568 scope.go:117] "RemoveContainer" containerID="c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21" Apr 22 17:54:50.764892 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.764860 2568 scope.go:117] "RemoveContainer" 
containerID="c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21" Apr 22 17:54:50.765156 ip-10-0-142-118 kubenswrapper[2568]: E0422 17:54:50.765132 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21\": container with ID starting with c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21 not found: ID does not exist" containerID="c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21" Apr 22 17:54:50.765207 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.765168 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21"} err="failed to get container status \"c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21\": rpc error: code = NotFound desc = could not find container \"c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21\": container with ID starting with c542b1e36dec5312cbefc1dfcf048a1b365ed28400d16ec3fb500bfa079b9a21 not found: ID does not exist" Apr 22 17:54:50.785195 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.785167 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8599fb6f6c-qtwn7"] Apr 22 17:54:50.791349 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:50.791327 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8599fb6f6c-qtwn7"] Apr 22 17:54:52.212264 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:52.212231 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f87f2c-183f-4d31-91cd-2752918acc59" path="/var/lib/kubelet/pods/75f87f2c-183f-4d31-91cd-2752918acc59/volumes" Apr 22 17:54:59.780783 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:59.780743 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="75454aa1-9f9c-481a-b5e0-248d97ce5213" containerID="278c13a16c0448b0f3eb2e576339affd86dfa62b95e347188aa2d529d74ab6ca" exitCode=0 Apr 22 17:54:59.781191 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:59.780800 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vkb44" event={"ID":"75454aa1-9f9c-481a-b5e0-248d97ce5213","Type":"ContainerDied","Data":"278c13a16c0448b0f3eb2e576339affd86dfa62b95e347188aa2d529d74ab6ca"} Apr 22 17:54:59.781191 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:54:59.781115 2568 scope.go:117] "RemoveContainer" containerID="278c13a16c0448b0f3eb2e576339affd86dfa62b95e347188aa2d529d74ab6ca" Apr 22 17:55:00.183773 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:00.183685 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b747876cb-7f77q_224a42db-ff4d-4e18-a064-b7f2a7b10e91/router/0.log" Apr 22 17:55:00.196180 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:00.196158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rnpt6_fa19e254-4e3d-4822-81d3-7ea095625185/serve-healthcheck-canary/0.log" Apr 22 17:55:00.785298 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:00.785261 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vkb44" event={"ID":"75454aa1-9f9c-481a-b5e0-248d97ce5213","Type":"ContainerStarted","Data":"2cc9faa768b37a4a0af95a7570b154ac41112760b8f1e418dca0333b8274f966"} Apr 22 17:55:14.824407 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:14.824367 2568 generic.go:358] "Generic (PLEG): container finished" podID="242c5d0e-f778-473e-a7b6-3a94132fea7c" containerID="6e1ed20e77a48b13b446bd441bc034bcee88b2dfd4e6ce70d356157aac5f13a5" exitCode=0 Apr 22 17:55:14.824827 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:14.824444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" event={"ID":"242c5d0e-f778-473e-a7b6-3a94132fea7c","Type":"ContainerDied","Data":"6e1ed20e77a48b13b446bd441bc034bcee88b2dfd4e6ce70d356157aac5f13a5"} Apr 22 17:55:14.824827 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:14.824811 2568 scope.go:117] "RemoveContainer" containerID="6e1ed20e77a48b13b446bd441bc034bcee88b2dfd4e6ce70d356157aac5f13a5" Apr 22 17:55:15.621265 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:15.621228 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" podUID="7cd04bd0-8da0-4aa1-8212-af5aa3c652d6" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 17:55:15.828456 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:15.828421 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sgbxb" event={"ID":"242c5d0e-f778-473e-a7b6-3a94132fea7c","Type":"ContainerStarted","Data":"317064e2c6d6057e2b739a99492920bca207fc5b72ff6d31e8bc36555315e396"} Apr 22 17:55:25.621279 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:25.621234 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" podUID="7cd04bd0-8da0-4aa1-8212-af5aa3c652d6" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 17:55:35.621167 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:35.621128 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" podUID="7cd04bd0-8da0-4aa1-8212-af5aa3c652d6" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 17:55:35.621639 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:35.621202 2568 kubelet.go:2658] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" Apr 22 17:55:35.621639 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:35.621631 2568 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"dd57489737c0cdb864d83456e0610354c8339ac6c51a46ebd48ac34b38b28c43"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 17:55:35.621713 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:35.621664 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" podUID="7cd04bd0-8da0-4aa1-8212-af5aa3c652d6" containerName="service-proxy" containerID="cri-o://dd57489737c0cdb864d83456e0610354c8339ac6c51a46ebd48ac34b38b28c43" gracePeriod=30 Apr 22 17:55:35.887826 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:35.887756 2568 generic.go:358] "Generic (PLEG): container finished" podID="7cd04bd0-8da0-4aa1-8212-af5aa3c652d6" containerID="dd57489737c0cdb864d83456e0610354c8339ac6c51a46ebd48ac34b38b28c43" exitCode=2 Apr 22 17:55:35.887826 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:35.887767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" event={"ID":"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6","Type":"ContainerDied","Data":"dd57489737c0cdb864d83456e0610354c8339ac6c51a46ebd48ac34b38b28c43"} Apr 22 17:55:35.887826 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:55:35.887800 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-86784fd9d-bgfcr" 
event={"ID":"7cd04bd0-8da0-4aa1-8212-af5aa3c652d6","Type":"ContainerStarted","Data":"dff1014a4ac1eb374e347fb60118ab8e6364da0b32967e4105b5886f63fdec93"} Apr 22 17:57:52.117256 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:57:52.117223 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log" Apr 22 17:57:52.118088 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:57:52.118070 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log" Apr 22 17:57:52.125814 ip-10-0-142-118 kubenswrapper[2568]: I0422 17:57:52.125791 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:01:32.623496 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.623459 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2"] Apr 22 18:01:32.624019 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.623767 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75f87f2c-183f-4d31-91cd-2752918acc59" containerName="registry" Apr 22 18:01:32.624019 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.623778 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f87f2c-183f-4d31-91cd-2752918acc59" containerName="registry" Apr 22 18:01:32.624019 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.623828 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="75f87f2c-183f-4d31-91cd-2752918acc59" containerName="registry" Apr 22 18:01:32.626531 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.626515 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.629626 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.629605 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:01:32.629775 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.629639 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:01:32.631045 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.631030 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:01:32.631150 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.631050 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:01:32.631150 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.631079 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-wjvx9\"" Apr 22 18:01:32.631320 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.631301 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:01:32.641375 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.637457 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2"] Apr 22 18:01:32.720680 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.720635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/46198b9a-ac49-4f44-8d9c-fc29591ae093-manager-config\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " 
pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.720680 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.720683 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46198b9a-ac49-4f44-8d9c-fc29591ae093-cert\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.720943 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.720716 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48zd\" (UniqueName: \"kubernetes.io/projected/46198b9a-ac49-4f44-8d9c-fc29591ae093-kube-api-access-q48zd\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.720943 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.720794 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/46198b9a-ac49-4f44-8d9c-fc29591ae093-metrics-cert\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.821452 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.821420 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/46198b9a-ac49-4f44-8d9c-fc29591ae093-metrics-cert\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.821587 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.821466 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/46198b9a-ac49-4f44-8d9c-fc29591ae093-manager-config\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.821587 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.821484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46198b9a-ac49-4f44-8d9c-fc29591ae093-cert\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.821587 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.821517 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q48zd\" (UniqueName: \"kubernetes.io/projected/46198b9a-ac49-4f44-8d9c-fc29591ae093-kube-api-access-q48zd\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.822222 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.822196 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/46198b9a-ac49-4f44-8d9c-fc29591ae093-manager-config\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.823959 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.823935 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46198b9a-ac49-4f44-8d9c-fc29591ae093-cert\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " 
pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.824054 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.823978 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/46198b9a-ac49-4f44-8d9c-fc29591ae093-metrics-cert\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.831927 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.831905 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48zd\" (UniqueName: \"kubernetes.io/projected/46198b9a-ac49-4f44-8d9c-fc29591ae093-kube-api-access-q48zd\") pod \"lws-controller-manager-5b9bbc5c4d-cplp2\" (UID: \"46198b9a-ac49-4f44-8d9c-fc29591ae093\") " pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:32.935405 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:32.935328 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:01:33.051948 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:33.051923 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2"] Apr 22 18:01:33.054390 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:01:33.054365 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46198b9a_ac49_4f44_8d9c_fc29591ae093.slice/crio-d27c013a6143b3114f86250a0d74a02b1e11875679c03fe973280571e7a464c3 WatchSource:0}: Error finding container d27c013a6143b3114f86250a0d74a02b1e11875679c03fe973280571e7a464c3: Status 404 returned error can't find the container with id d27c013a6143b3114f86250a0d74a02b1e11875679c03fe973280571e7a464c3 Apr 22 18:01:33.056109 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:33.056092 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:01:33.854554 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:33.854513 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" event={"ID":"46198b9a-ac49-4f44-8d9c-fc29591ae093","Type":"ContainerStarted","Data":"d27c013a6143b3114f86250a0d74a02b1e11875679c03fe973280571e7a464c3"} Apr 22 18:01:35.861702 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:35.861666 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" event={"ID":"46198b9a-ac49-4f44-8d9c-fc29591ae093","Type":"ContainerStarted","Data":"67d389eb729d3e19bf77eacba312a2a25afc9db25ee204c3b9d6f7adaeacc709"} Apr 22 18:01:35.862220 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:35.861783 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 
18:01:35.879668 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:35.879628 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" podStartSLOduration=1.578012352 podStartE2EDuration="3.879612209s" podCreationTimestamp="2026-04-22 18:01:32 +0000 UTC" firstStartedPulling="2026-04-22 18:01:33.056242746 +0000 UTC m=+521.364874212" lastFinishedPulling="2026-04-22 18:01:35.357842603 +0000 UTC m=+523.666474069" observedRunningTime="2026-04-22 18:01:35.87915267 +0000 UTC m=+524.187784157" watchObservedRunningTime="2026-04-22 18:01:35.879612209 +0000 UTC m=+524.188243698" Apr 22 18:01:46.866924 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:01:46.866845 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5b9bbc5c4d-cplp2" Apr 22 18:02:31.310514 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.310472 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn"] Apr 22 18:02:31.312909 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.312888 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn" Apr 22 18:02:31.317570 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.317549 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:02:31.317659 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.317598 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 22 18:02:31.317745 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.317656 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-rsk5l\"" Apr 22 18:02:31.318693 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.318676 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:02:31.333581 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.333562 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn"] Apr 22 18:02:31.374891 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.374870 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khcrr\" (UniqueName: \"kubernetes.io/projected/090e3c02-96d2-479e-8871-e8358b3f1d4e-kube-api-access-khcrr\") pod \"dns-operator-controller-manager-844548ff4c-4psqn\" (UID: \"090e3c02-96d2-479e-8871-e8358b3f1d4e\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn" Apr 22 18:02:31.475207 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.475179 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khcrr\" (UniqueName: \"kubernetes.io/projected/090e3c02-96d2-479e-8871-e8358b3f1d4e-kube-api-access-khcrr\") pod 
\"dns-operator-controller-manager-844548ff4c-4psqn\" (UID: \"090e3c02-96d2-479e-8871-e8358b3f1d4e\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn" Apr 22 18:02:31.493425 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.493402 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khcrr\" (UniqueName: \"kubernetes.io/projected/090e3c02-96d2-479e-8871-e8358b3f1d4e-kube-api-access-khcrr\") pod \"dns-operator-controller-manager-844548ff4c-4psqn\" (UID: \"090e3c02-96d2-479e-8871-e8358b3f1d4e\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn" Apr 22 18:02:31.623100 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.623018 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn" Apr 22 18:02:31.745574 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:31.745543 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn"] Apr 22 18:02:31.749136 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:02:31.749104 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod090e3c02_96d2_479e_8871_e8358b3f1d4e.slice/crio-c081f7736f8bf337955933b67f071501bfa3074ba0f732ca4275f49f2fa3370d WatchSource:0}: Error finding container c081f7736f8bf337955933b67f071501bfa3074ba0f732ca4275f49f2fa3370d: Status 404 returned error can't find the container with id c081f7736f8bf337955933b67f071501bfa3074ba0f732ca4275f49f2fa3370d Apr 22 18:02:32.022893 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:32.022863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn" event={"ID":"090e3c02-96d2-479e-8871-e8358b3f1d4e","Type":"ContainerStarted","Data":"c081f7736f8bf337955933b67f071501bfa3074ba0f732ca4275f49f2fa3370d"} Apr 22 
18:02:33.286356 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.286319 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"] Apr 22 18:02:33.288798 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.288774 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s" Apr 22 18:02:33.289186 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.289158 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s" Apr 22 18:02:33.289313 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.289202 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcl64\" (UniqueName: \"kubernetes.io/projected/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-kube-api-access-jcl64\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s" Apr 22 18:02:33.289313 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.289250 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s" Apr 22 18:02:33.291483 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.291390 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 22 18:02:33.291998 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.291914 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 22 18:02:33.291998 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.291939 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-wdlw5\""
Apr 22 18:02:33.306761 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.305016 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"]
Apr 22 18:02:33.389845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.389814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"
Apr 22 18:02:33.389845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.389847 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcl64\" (UniqueName: \"kubernetes.io/projected/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-kube-api-access-jcl64\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"
Apr 22 18:02:33.390077 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.389876 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"
Apr 22 18:02:33.390077 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:02:33.389971 2568 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 22 18:02:33.390077 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:02:33.390062 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-plugin-serving-cert podName:eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c nodeName:}" failed. No retries permitted until 2026-04-22 18:02:33.890039748 +0000 UTC m=+582.198671215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-6jf4s" (UID: "eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c") : secret "plugin-serving-cert" not found
Apr 22 18:02:33.390443 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.390426 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"
Apr 22 18:02:33.398144 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.398119 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcl64\" (UniqueName: \"kubernetes.io/projected/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-kube-api-access-jcl64\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"
Apr 22 18:02:33.893068 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.893034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"
Apr 22 18:02:33.895574 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.895549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-6jf4s\" (UID: \"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"
Apr 22 18:02:33.900405 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:33.900377 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"
Apr 22 18:02:34.506503 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:34.506451 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s"]
Apr 22 18:02:34.508956 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:02:34.508932 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb5a9fc9_46b6_459b_99df_2b4ffaa8e90c.slice/crio-f23b3e2cd0cfd058bb593a54b2aeebf822a9f9f478245f94fec796d44b0b7cb8 WatchSource:0}: Error finding container f23b3e2cd0cfd058bb593a54b2aeebf822a9f9f478245f94fec796d44b0b7cb8: Status 404 returned error can't find the container with id f23b3e2cd0cfd058bb593a54b2aeebf822a9f9f478245f94fec796d44b0b7cb8
Apr 22 18:02:35.034680 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:35.034644 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s" event={"ID":"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c","Type":"ContainerStarted","Data":"f23b3e2cd0cfd058bb593a54b2aeebf822a9f9f478245f94fec796d44b0b7cb8"}
Apr 22 18:02:35.036030 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:35.035999 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn" event={"ID":"090e3c02-96d2-479e-8871-e8358b3f1d4e","Type":"ContainerStarted","Data":"1645a695ecf4448f6fbc6f023272c8e68366a47c553b8278b3bc752060a26c6d"}
Apr 22 18:02:35.036175 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:35.036161 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn"
Apr 22 18:02:35.067171 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:35.067127 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn" podStartSLOduration=1.3951987639999999 podStartE2EDuration="4.067114305s" podCreationTimestamp="2026-04-22 18:02:31 +0000 UTC" firstStartedPulling="2026-04-22 18:02:31.751216122 +0000 UTC m=+580.059847590" lastFinishedPulling="2026-04-22 18:02:34.423131657 +0000 UTC m=+582.731763131" observedRunningTime="2026-04-22 18:02:35.066018336 +0000 UTC m=+583.374649825" watchObservedRunningTime="2026-04-22 18:02:35.067114305 +0000 UTC m=+583.375745791"
Apr 22 18:02:40.054297 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:40.054262 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s" event={"ID":"eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c","Type":"ContainerStarted","Data":"175dd48de615cdbec31dca3699d2ba3d89737fdb17afead5ac64f4870a94417a"}
Apr 22 18:02:40.071399 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:40.071347 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-6jf4s" podStartSLOduration=1.736059633 podStartE2EDuration="7.071329906s" podCreationTimestamp="2026-04-22 18:02:33 +0000 UTC" firstStartedPulling="2026-04-22 18:02:34.51030334 +0000 UTC m=+582.818934805" lastFinishedPulling="2026-04-22 18:02:39.845573605 +0000 UTC m=+588.154205078" observedRunningTime="2026-04-22 18:02:40.071094293 +0000 UTC m=+588.379725791" watchObservedRunningTime="2026-04-22 18:02:40.071329906 +0000 UTC m=+588.379961394"
Apr 22 18:02:46.042648 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:46.042616 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-4psqn"
Apr 22 18:02:52.140413 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:52.140386 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log"
Apr 22 18:02:52.140819 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:02:52.140424 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log"
Apr 22 18:03:14.149817 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.149740 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-sttvs"]
Apr 22 18:03:14.152759 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.152722 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:14.154954 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.154934 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 18:03:14.161642 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.161621 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-sttvs"]
Apr 22 18:03:14.175511 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.175483 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-sttvs"]
Apr 22 18:03:14.209176 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.209145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6l2g\" (UniqueName: \"kubernetes.io/projected/682a9f48-b938-41dd-8f69-4753f78876f5-kube-api-access-q6l2g\") pod \"limitador-limitador-67566c68b4-sttvs\" (UID: \"682a9f48-b938-41dd-8f69-4753f78876f5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:14.209329 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.209196 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/682a9f48-b938-41dd-8f69-4753f78876f5-config-file\") pod \"limitador-limitador-67566c68b4-sttvs\" (UID: \"682a9f48-b938-41dd-8f69-4753f78876f5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:14.310542 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.310507 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/682a9f48-b938-41dd-8f69-4753f78876f5-config-file\") pod \"limitador-limitador-67566c68b4-sttvs\" (UID: \"682a9f48-b938-41dd-8f69-4753f78876f5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:14.310719 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.310594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6l2g\" (UniqueName: \"kubernetes.io/projected/682a9f48-b938-41dd-8f69-4753f78876f5-kube-api-access-q6l2g\") pod \"limitador-limitador-67566c68b4-sttvs\" (UID: \"682a9f48-b938-41dd-8f69-4753f78876f5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:14.311239 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.311218 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/682a9f48-b938-41dd-8f69-4753f78876f5-config-file\") pod \"limitador-limitador-67566c68b4-sttvs\" (UID: \"682a9f48-b938-41dd-8f69-4753f78876f5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:14.319190 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.319166 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6l2g\" (UniqueName: \"kubernetes.io/projected/682a9f48-b938-41dd-8f69-4753f78876f5-kube-api-access-q6l2g\") pod \"limitador-limitador-67566c68b4-sttvs\" (UID: \"682a9f48-b938-41dd-8f69-4753f78876f5\") " pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:14.462488 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.462446 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:14.584623 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.584599 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-sttvs"]
Apr 22 18:03:14.586405 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:03:14.586368 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod682a9f48_b938_41dd_8f69_4753f78876f5.slice/crio-8172b3dfc28102344ae69482bfacd3f8910ab54b43f2024050d7c3cc1b3cb1a2 WatchSource:0}: Error finding container 8172b3dfc28102344ae69482bfacd3f8910ab54b43f2024050d7c3cc1b3cb1a2: Status 404 returned error can't find the container with id 8172b3dfc28102344ae69482bfacd3f8910ab54b43f2024050d7c3cc1b3cb1a2
Apr 22 18:03:14.826120 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.826010 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-q2ztd"]
Apr 22 18:03:14.830416 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.830399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-q2ztd"
Apr 22 18:03:14.832930 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.832913 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-bljh8\""
Apr 22 18:03:14.836310 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.836289 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-q2ztd"]
Apr 22 18:03:14.915316 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:14.915284 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6f2\" (UniqueName: \"kubernetes.io/projected/3fbe73be-865f-47bd-afff-0ed09bb71af8-kube-api-access-gt6f2\") pod \"authorino-79cbc94b89-q2ztd\" (UID: \"3fbe73be-865f-47bd-afff-0ed09bb71af8\") " pod="kuadrant-system/authorino-79cbc94b89-q2ztd"
Apr 22 18:03:15.016094 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:15.016060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6f2\" (UniqueName: \"kubernetes.io/projected/3fbe73be-865f-47bd-afff-0ed09bb71af8-kube-api-access-gt6f2\") pod \"authorino-79cbc94b89-q2ztd\" (UID: \"3fbe73be-865f-47bd-afff-0ed09bb71af8\") " pod="kuadrant-system/authorino-79cbc94b89-q2ztd"
Apr 22 18:03:15.024999 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:15.024963 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6f2\" (UniqueName: \"kubernetes.io/projected/3fbe73be-865f-47bd-afff-0ed09bb71af8-kube-api-access-gt6f2\") pod \"authorino-79cbc94b89-q2ztd\" (UID: \"3fbe73be-865f-47bd-afff-0ed09bb71af8\") " pod="kuadrant-system/authorino-79cbc94b89-q2ztd"
Apr 22 18:03:15.140334 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:15.140247 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-q2ztd"
Apr 22 18:03:15.159686 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:15.159641 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs" event={"ID":"682a9f48-b938-41dd-8f69-4753f78876f5","Type":"ContainerStarted","Data":"8172b3dfc28102344ae69482bfacd3f8910ab54b43f2024050d7c3cc1b3cb1a2"}
Apr 22 18:03:15.270847 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:15.270820 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-q2ztd"]
Apr 22 18:03:15.273082 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:03:15.273053 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fbe73be_865f_47bd_afff_0ed09bb71af8.slice/crio-e034dedb7d5c1b3d97d6ad6ec3367afc52d8b106e2d581269a75c884155dab87 WatchSource:0}: Error finding container e034dedb7d5c1b3d97d6ad6ec3367afc52d8b106e2d581269a75c884155dab87: Status 404 returned error can't find the container with id e034dedb7d5c1b3d97d6ad6ec3367afc52d8b106e2d581269a75c884155dab87
Apr 22 18:03:16.164887 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:16.164859 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-q2ztd" event={"ID":"3fbe73be-865f-47bd-afff-0ed09bb71af8","Type":"ContainerStarted","Data":"e034dedb7d5c1b3d97d6ad6ec3367afc52d8b106e2d581269a75c884155dab87"}
Apr 22 18:03:17.168911 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:17.168867 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs" event={"ID":"682a9f48-b938-41dd-8f69-4753f78876f5","Type":"ContainerStarted","Data":"54fb47a52608b6fd375b1a331b9f0044522febae6b88c8828a07469742559f84"}
Apr 22 18:03:17.169317 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:17.168974 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:17.208881 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:17.208831 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs" podStartSLOduration=1.687992778 podStartE2EDuration="3.208814592s" podCreationTimestamp="2026-04-22 18:03:14 +0000 UTC" firstStartedPulling="2026-04-22 18:03:14.588309978 +0000 UTC m=+622.896941443" lastFinishedPulling="2026-04-22 18:03:16.109131789 +0000 UTC m=+624.417763257" observedRunningTime="2026-04-22 18:03:17.207792456 +0000 UTC m=+625.516423945" watchObservedRunningTime="2026-04-22 18:03:17.208814592 +0000 UTC m=+625.517446191"
Apr 22 18:03:19.177262 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:19.177224 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-q2ztd" event={"ID":"3fbe73be-865f-47bd-afff-0ed09bb71af8","Type":"ContainerStarted","Data":"103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467"}
Apr 22 18:03:19.192286 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:19.192235 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-q2ztd" podStartSLOduration=1.947087126 podStartE2EDuration="5.192221815s" podCreationTimestamp="2026-04-22 18:03:14 +0000 UTC" firstStartedPulling="2026-04-22 18:03:15.274333354 +0000 UTC m=+623.582964819" lastFinishedPulling="2026-04-22 18:03:18.519468028 +0000 UTC m=+626.828099508" observedRunningTime="2026-04-22 18:03:19.190813623 +0000 UTC m=+627.499445111" watchObservedRunningTime="2026-04-22 18:03:19.192221815 +0000 UTC m=+627.500853301"
Apr 22 18:03:28.173293 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:28.173253 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-sttvs"
Apr 22 18:03:37.479657 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.479620 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-2mdqj"]
Apr 22 18:03:37.482980 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.482961 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-2mdqj"
Apr 22 18:03:37.485492 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.485473 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 22 18:03:37.488236 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.488198 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-2mdqj"]
Apr 22 18:03:37.508854 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.508819 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76fv8\" (UniqueName: \"kubernetes.io/projected/69d51c27-f087-405f-99f9-cc012eb420cd-kube-api-access-76fv8\") pod \"authorino-68bd676465-2mdqj\" (UID: \"69d51c27-f087-405f-99f9-cc012eb420cd\") " pod="kuadrant-system/authorino-68bd676465-2mdqj"
Apr 22 18:03:37.508991 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.508896 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/69d51c27-f087-405f-99f9-cc012eb420cd-tls-cert\") pod \"authorino-68bd676465-2mdqj\" (UID: \"69d51c27-f087-405f-99f9-cc012eb420cd\") " pod="kuadrant-system/authorino-68bd676465-2mdqj"
Apr 22 18:03:37.609464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.609430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fv8\" (UniqueName: \"kubernetes.io/projected/69d51c27-f087-405f-99f9-cc012eb420cd-kube-api-access-76fv8\") pod \"authorino-68bd676465-2mdqj\" (UID: \"69d51c27-f087-405f-99f9-cc012eb420cd\") " pod="kuadrant-system/authorino-68bd676465-2mdqj"
Apr 22 18:03:37.609627 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.609473 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/69d51c27-f087-405f-99f9-cc012eb420cd-tls-cert\") pod \"authorino-68bd676465-2mdqj\" (UID: \"69d51c27-f087-405f-99f9-cc012eb420cd\") " pod="kuadrant-system/authorino-68bd676465-2mdqj"
Apr 22 18:03:37.611836 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.611810 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/69d51c27-f087-405f-99f9-cc012eb420cd-tls-cert\") pod \"authorino-68bd676465-2mdqj\" (UID: \"69d51c27-f087-405f-99f9-cc012eb420cd\") " pod="kuadrant-system/authorino-68bd676465-2mdqj"
Apr 22 18:03:37.617479 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.617444 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76fv8\" (UniqueName: \"kubernetes.io/projected/69d51c27-f087-405f-99f9-cc012eb420cd-kube-api-access-76fv8\") pod \"authorino-68bd676465-2mdqj\" (UID: \"69d51c27-f087-405f-99f9-cc012eb420cd\") " pod="kuadrant-system/authorino-68bd676465-2mdqj"
Apr 22 18:03:37.793368 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.793285 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-2mdqj"
Apr 22 18:03:37.912920 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:37.912895 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-2mdqj"]
Apr 22 18:03:37.915146 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:03:37.915113 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69d51c27_f087_405f_99f9_cc012eb420cd.slice/crio-46720bccca802171a4b34d412894eeaaf8f81227482896aeccbb53d5e282b391 WatchSource:0}: Error finding container 46720bccca802171a4b34d412894eeaaf8f81227482896aeccbb53d5e282b391: Status 404 returned error can't find the container with id 46720bccca802171a4b34d412894eeaaf8f81227482896aeccbb53d5e282b391
Apr 22 18:03:38.237360 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:38.237323 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-2mdqj" event={"ID":"69d51c27-f087-405f-99f9-cc012eb420cd","Type":"ContainerStarted","Data":"46720bccca802171a4b34d412894eeaaf8f81227482896aeccbb53d5e282b391"}
Apr 22 18:03:39.242106 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:39.242069 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-2mdqj" event={"ID":"69d51c27-f087-405f-99f9-cc012eb420cd","Type":"ContainerStarted","Data":"6bdaf0b45c97a0baee6a6501b6f77b64c53419e4a17e12fd483fa0046a8f88c6"}
Apr 22 18:03:39.258580 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:39.258521 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-2mdqj" podStartSLOduration=1.728327217 podStartE2EDuration="2.258505396s" podCreationTimestamp="2026-04-22 18:03:37 +0000 UTC" firstStartedPulling="2026-04-22 18:03:37.916491549 +0000 UTC m=+646.225123014" lastFinishedPulling="2026-04-22 18:03:38.446669725 +0000 UTC m=+646.755301193" observedRunningTime="2026-04-22 18:03:39.256433675 +0000 UTC m=+647.565065162" watchObservedRunningTime="2026-04-22 18:03:39.258505396 +0000 UTC m=+647.567136861"
Apr 22 18:03:39.282220 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:39.282186 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-q2ztd"]
Apr 22 18:03:39.282523 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:39.282479 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-q2ztd" podUID="3fbe73be-865f-47bd-afff-0ed09bb71af8" containerName="authorino" containerID="cri-o://103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467" gracePeriod=30
Apr 22 18:03:39.524603 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:39.524571 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-q2ztd"
Apr 22 18:03:39.624365 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:39.624335 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt6f2\" (UniqueName: \"kubernetes.io/projected/3fbe73be-865f-47bd-afff-0ed09bb71af8-kube-api-access-gt6f2\") pod \"3fbe73be-865f-47bd-afff-0ed09bb71af8\" (UID: \"3fbe73be-865f-47bd-afff-0ed09bb71af8\") "
Apr 22 18:03:39.626509 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:39.626478 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbe73be-865f-47bd-afff-0ed09bb71af8-kube-api-access-gt6f2" (OuterVolumeSpecName: "kube-api-access-gt6f2") pod "3fbe73be-865f-47bd-afff-0ed09bb71af8" (UID: "3fbe73be-865f-47bd-afff-0ed09bb71af8"). InnerVolumeSpecName "kube-api-access-gt6f2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:03:39.724948 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:39.724912 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gt6f2\" (UniqueName: \"kubernetes.io/projected/3fbe73be-865f-47bd-afff-0ed09bb71af8-kube-api-access-gt6f2\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:03:40.246644 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.246610 2568 generic.go:358] "Generic (PLEG): container finished" podID="3fbe73be-865f-47bd-afff-0ed09bb71af8" containerID="103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467" exitCode=0
Apr 22 18:03:40.247062 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.246652 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-q2ztd" event={"ID":"3fbe73be-865f-47bd-afff-0ed09bb71af8","Type":"ContainerDied","Data":"103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467"}
Apr 22 18:03:40.247062 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.246688 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-q2ztd" event={"ID":"3fbe73be-865f-47bd-afff-0ed09bb71af8","Type":"ContainerDied","Data":"e034dedb7d5c1b3d97d6ad6ec3367afc52d8b106e2d581269a75c884155dab87"}
Apr 22 18:03:40.247062 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.246704 2568 scope.go:117] "RemoveContainer" containerID="103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467"
Apr 22 18:03:40.247062 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.246665 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-q2ztd"
Apr 22 18:03:40.254455 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.254434 2568 scope.go:117] "RemoveContainer" containerID="103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467"
Apr 22 18:03:40.254709 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:03:40.254689 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467\": container with ID starting with 103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467 not found: ID does not exist" containerID="103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467"
Apr 22 18:03:40.254843 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.254722 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467"} err="failed to get container status \"103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467\": rpc error: code = NotFound desc = could not find container \"103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467\": container with ID starting with 103808abc2b734ae952234890f42d81504bd7f3167aadfaad0c435a9fc28c467 not found: ID does not exist"
Apr 22 18:03:40.262861 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.262839 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-q2ztd"]
Apr 22 18:03:40.268972 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:40.268940 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-q2ztd"]
Apr 22 18:03:42.213335 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:03:42.213305 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbe73be-865f-47bd-afff-0ed09bb71af8" path="/var/lib/kubelet/pods/3fbe73be-865f-47bd-afff-0ed09bb71af8/volumes"
Apr 22 18:06:21.384336 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.384257 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"]
Apr 22 18:06:21.384833 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.384581 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fbe73be-865f-47bd-afff-0ed09bb71af8" containerName="authorino"
Apr 22 18:06:21.384833 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.384594 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbe73be-865f-47bd-afff-0ed09bb71af8" containerName="authorino"
Apr 22 18:06:21.384833 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.384656 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fbe73be-865f-47bd-afff-0ed09bb71af8" containerName="authorino"
Apr 22 18:06:21.387478 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.387458 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.390205 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.390182 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:06:21.390312 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.390206 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\""
Apr 22 18:06:21.390449 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.390437 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:06:21.391259 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.391240 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 22 18:06:21.398311 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.398289 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"]
Apr 22 18:06:21.531352 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.531312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.531352 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.531358 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00e81c93-f756-445f-bc3d-4aea89d90d72-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.531569 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.531383 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.531569 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.531453 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfhlq\" (UniqueName: \"kubernetes.io/projected/00e81c93-f756-445f-bc3d-4aea89d90d72-kube-api-access-wfhlq\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.531569 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.531482 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.531569 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.531544 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.631996 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.631950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.632193 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.632018 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.632193 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.632044 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00e81c93-f756-445f-bc3d-4aea89d90d72-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.632193 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.632067 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.632193 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.632094 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfhlq\" (UniqueName: \"kubernetes.io/projected/00e81c93-f756-445f-bc3d-4aea89d90d72-kube-api-access-wfhlq\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.632193 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.632125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.632462 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.632409 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.632462 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.632402 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.632550 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.632472 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.634307 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.634289 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.634670 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.634620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00e81c93-f756-445f-bc3d-4aea89d90d72-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"
Apr 22 18:06:21.639970 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.639948 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfhlq\" (UniqueName: \"kubernetes.io/projected/00e81c93-f756-445f-bc3d-4aea89d90d72-kube-api-access-wfhlq\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx\" (UID:
\"00e81c93-f756-445f-bc3d-4aea89d90d72\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" Apr 22 18:06:21.698015 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.697979 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" Apr 22 18:06:21.819491 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:21.819465 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"] Apr 22 18:06:21.822064 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:06:21.822034 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00e81c93_f756_445f_bc3d_4aea89d90d72.slice/crio-cc3be4040b614b36c918e09c5d86073c174d0a2162636bbfcf32e502ebcec050 WatchSource:0}: Error finding container cc3be4040b614b36c918e09c5d86073c174d0a2162636bbfcf32e502ebcec050: Status 404 returned error can't find the container with id cc3be4040b614b36c918e09c5d86073c174d0a2162636bbfcf32e502ebcec050 Apr 22 18:06:22.753657 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:22.753616 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" event={"ID":"00e81c93-f756-445f-bc3d-4aea89d90d72","Type":"ContainerStarted","Data":"cc3be4040b614b36c918e09c5d86073c174d0a2162636bbfcf32e502ebcec050"} Apr 22 18:06:25.769227 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:25.769189 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" event={"ID":"00e81c93-f756-445f-bc3d-4aea89d90d72","Type":"ContainerStarted","Data":"0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8"} Apr 22 18:06:29.785005 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:29.784920 
2568 generic.go:358] "Generic (PLEG): container finished" podID="00e81c93-f756-445f-bc3d-4aea89d90d72" containerID="0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8" exitCode=0 Apr 22 18:06:29.785005 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:29.784996 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" event={"ID":"00e81c93-f756-445f-bc3d-4aea89d90d72","Type":"ContainerDied","Data":"0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8"} Apr 22 18:06:31.793770 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:31.793709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" event={"ID":"00e81c93-f756-445f-bc3d-4aea89d90d72","Type":"ContainerStarted","Data":"86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03"} Apr 22 18:06:31.813300 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:31.813253 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" podStartSLOduration=1.785725995 podStartE2EDuration="10.813238517s" podCreationTimestamp="2026-04-22 18:06:21 +0000 UTC" firstStartedPulling="2026-04-22 18:06:21.823846936 +0000 UTC m=+810.132478401" lastFinishedPulling="2026-04-22 18:06:30.851359455 +0000 UTC m=+819.159990923" observedRunningTime="2026-04-22 18:06:31.812358274 +0000 UTC m=+820.120989772" watchObservedRunningTime="2026-04-22 18:06:31.813238517 +0000 UTC m=+820.121870021" Apr 22 18:06:41.698231 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:41.698188 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" Apr 22 18:06:41.698626 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:41.698233 2568 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" Apr 22 18:06:41.710822 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:41.710789 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" Apr 22 18:06:41.838033 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:06:41.838003 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" Apr 22 18:07:43.112535 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.112458 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"] Apr 22 18:07:43.113045 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.112770 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" podUID="00e81c93-f756-445f-bc3d-4aea89d90d72" containerName="main" containerID="cri-o://86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03" gracePeriod=30 Apr 22 18:07:43.351331 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.351307 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" Apr 22 18:07:43.497487 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.497453 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfhlq\" (UniqueName: \"kubernetes.io/projected/00e81c93-f756-445f-bc3d-4aea89d90d72-kube-api-access-wfhlq\") pod \"00e81c93-f756-445f-bc3d-4aea89d90d72\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " Apr 22 18:07:43.497670 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.497506 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-dshm\") pod \"00e81c93-f756-445f-bc3d-4aea89d90d72\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " Apr 22 18:07:43.497670 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.497533 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-kserve-provision-location\") pod \"00e81c93-f756-445f-bc3d-4aea89d90d72\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " Apr 22 18:07:43.497670 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.497566 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-home\") pod \"00e81c93-f756-445f-bc3d-4aea89d90d72\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " Apr 22 18:07:43.497670 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.497612 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-model-cache\") pod \"00e81c93-f756-445f-bc3d-4aea89d90d72\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " Apr 22 18:07:43.497670 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.497640 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00e81c93-f756-445f-bc3d-4aea89d90d72-tls-certs\") pod \"00e81c93-f756-445f-bc3d-4aea89d90d72\" (UID: \"00e81c93-f756-445f-bc3d-4aea89d90d72\") " Apr 22 18:07:43.497980 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.497919 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-home" (OuterVolumeSpecName: "home") pod "00e81c93-f756-445f-bc3d-4aea89d90d72" (UID: "00e81c93-f756-445f-bc3d-4aea89d90d72"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:43.498034 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.497980 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-model-cache" (OuterVolumeSpecName: "model-cache") pod "00e81c93-f756-445f-bc3d-4aea89d90d72" (UID: "00e81c93-f756-445f-bc3d-4aea89d90d72"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:43.499886 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.499862 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00e81c93-f756-445f-bc3d-4aea89d90d72-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "00e81c93-f756-445f-bc3d-4aea89d90d72" (UID: "00e81c93-f756-445f-bc3d-4aea89d90d72"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:07:43.500117 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.500100 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-dshm" (OuterVolumeSpecName: "dshm") pod "00e81c93-f756-445f-bc3d-4aea89d90d72" (UID: "00e81c93-f756-445f-bc3d-4aea89d90d72"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:43.500179 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.500132 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e81c93-f756-445f-bc3d-4aea89d90d72-kube-api-access-wfhlq" (OuterVolumeSpecName: "kube-api-access-wfhlq") pod "00e81c93-f756-445f-bc3d-4aea89d90d72" (UID: "00e81c93-f756-445f-bc3d-4aea89d90d72"). InnerVolumeSpecName "kube-api-access-wfhlq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:07:43.550667 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.550626 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "00e81c93-f756-445f-bc3d-4aea89d90d72" (UID: "00e81c93-f756-445f-bc3d-4aea89d90d72"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:07:43.599248 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.599215 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:07:43.599248 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.599244 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:07:43.599248 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.599255 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:07:43.599454 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.599265 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00e81c93-f756-445f-bc3d-4aea89d90d72-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:07:43.599454 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.599274 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00e81c93-f756-445f-bc3d-4aea89d90d72-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:07:43.599454 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:43.599282 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wfhlq\" (UniqueName: \"kubernetes.io/projected/00e81c93-f756-445f-bc3d-4aea89d90d72-kube-api-access-wfhlq\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:07:44.025688 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.025654 2568 
generic.go:358] "Generic (PLEG): container finished" podID="00e81c93-f756-445f-bc3d-4aea89d90d72" containerID="86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03" exitCode=0 Apr 22 18:07:44.025884 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.025743 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" event={"ID":"00e81c93-f756-445f-bc3d-4aea89d90d72","Type":"ContainerDied","Data":"86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03"} Apr 22 18:07:44.025884 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.025750 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" Apr 22 18:07:44.025884 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.025771 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx" event={"ID":"00e81c93-f756-445f-bc3d-4aea89d90d72","Type":"ContainerDied","Data":"cc3be4040b614b36c918e09c5d86073c174d0a2162636bbfcf32e502ebcec050"} Apr 22 18:07:44.025884 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.025786 2568 scope.go:117] "RemoveContainer" containerID="86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03" Apr 22 18:07:44.034566 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.034549 2568 scope.go:117] "RemoveContainer" containerID="0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8" Apr 22 18:07:44.049530 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.049501 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"] Apr 22 18:07:44.052829 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.052804 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-d64fdcfc-7cznx"] Apr 22 18:07:44.060314 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.060296 2568 scope.go:117] "RemoveContainer" containerID="86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03" Apr 22 18:07:44.060634 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:07:44.060616 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03\": container with ID starting with 86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03 not found: ID does not exist" containerID="86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03" Apr 22 18:07:44.060686 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.060642 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03"} err="failed to get container status \"86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03\": rpc error: code = NotFound desc = could not find container \"86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03\": container with ID starting with 86bda9e02e48c59177086b2f51e4c06a418eb5bdd52fa2d9c6b6a45102354c03 not found: ID does not exist" Apr 22 18:07:44.060686 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.060661 2568 scope.go:117] "RemoveContainer" containerID="0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8" Apr 22 18:07:44.060922 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:07:44.060905 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8\": container with ID starting with 0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8 not found: ID does not exist" 
containerID="0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8" Apr 22 18:07:44.060968 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.060927 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8"} err="failed to get container status \"0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8\": rpc error: code = NotFound desc = could not find container \"0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8\": container with ID starting with 0ee394b0253557014b198ae31a8dc2efee52e39bec14e10c6f7613a91a99dad8 not found: ID does not exist" Apr 22 18:07:44.213332 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:44.213298 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e81c93-f756-445f-bc3d-4aea89d90d72" path="/var/lib/kubelet/pods/00e81c93-f756-445f-bc3d-4aea89d90d72/volumes" Apr 22 18:07:52.163521 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:52.163490 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log" Apr 22 18:07:52.165889 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:07:52.165867 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log" Apr 22 18:08:02.687387 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.687354 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm"] Apr 22 18:08:02.687781 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.687684 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00e81c93-f756-445f-bc3d-4aea89d90d72" containerName="storage-initializer" Apr 22 18:08:02.687781 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:08:02.687695 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e81c93-f756-445f-bc3d-4aea89d90d72" containerName="storage-initializer" Apr 22 18:08:02.687781 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.687704 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00e81c93-f756-445f-bc3d-4aea89d90d72" containerName="main" Apr 22 18:08:02.687781 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.687711 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e81c93-f756-445f-bc3d-4aea89d90d72" containerName="main" Apr 22 18:08:02.687915 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.687785 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="00e81c93-f756-445f-bc3d-4aea89d90d72" containerName="main" Apr 22 18:08:02.692098 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.692080 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.695041 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.695017 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:08:02.695041 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.695017 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:08:02.695207 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.695060 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:08:02.695207 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.695118 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 18:08:02.701354 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:08:02.701331 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm"] Apr 22 18:08:02.751266 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.751234 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxnsl\" (UniqueName: \"kubernetes.io/projected/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kube-api-access-pxnsl\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.751427 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.751272 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.751427 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.751312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.751427 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.751330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-home\") 
pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.751427 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.751348 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f7652b-611f-4a57-b942-f12fe44a9a3e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.751427 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.751425 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.852663 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.852629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.852846 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.852681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxnsl\" (UniqueName: \"kubernetes.io/projected/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kube-api-access-pxnsl\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.852846 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.852706 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.852846 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.852753 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.852846 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.852780 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.852846 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.852809 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f7652b-611f-4a57-b942-f12fe44a9a3e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: 
\"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.853132 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.853114 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.853205 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.853182 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.853267 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.853205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.854950 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.854923 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.855193 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.855178 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f7652b-611f-4a57-b942-f12fe44a9a3e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:02.860345 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:02.860324 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxnsl\" (UniqueName: \"kubernetes.io/projected/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kube-api-access-pxnsl\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:03.001965 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:03.001925 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:03.123614 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:03.123409 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm"] Apr 22 18:08:03.127257 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:08:03.127217 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5f7652b_611f_4a57_b942_f12fe44a9a3e.slice/crio-aea46906ae2de214e0677a893d770c104822a65d9c66667c2b3974524c705d35 WatchSource:0}: Error finding container aea46906ae2de214e0677a893d770c104822a65d9c66667c2b3974524c705d35: Status 404 returned error can't find the container with id aea46906ae2de214e0677a893d770c104822a65d9c66667c2b3974524c705d35 Apr 22 18:08:03.128940 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:03.128922 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:08:04.092564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:04.092524 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" event={"ID":"d5f7652b-611f-4a57-b942-f12fe44a9a3e","Type":"ContainerStarted","Data":"1304fd8b893270d169a2e51ba7fa84104f8796e1b8a4ef1d764e5e31ba79ecd0"} Apr 22 18:08:04.092564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:04.092564 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" event={"ID":"d5f7652b-611f-4a57-b942-f12fe44a9a3e","Type":"ContainerStarted","Data":"aea46906ae2de214e0677a893d770c104822a65d9c66667c2b3974524c705d35"} Apr 22 18:08:08.108198 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:08.108163 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerID="1304fd8b893270d169a2e51ba7fa84104f8796e1b8a4ef1d764e5e31ba79ecd0" exitCode=0 Apr 22 18:08:08.108564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:08.108208 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" event={"ID":"d5f7652b-611f-4a57-b942-f12fe44a9a3e","Type":"ContainerDied","Data":"1304fd8b893270d169a2e51ba7fa84104f8796e1b8a4ef1d764e5e31ba79ecd0"} Apr 22 18:08:17.112343 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.112307 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx"] Apr 22 18:08:17.170621 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.170057 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx"] Apr 22 18:08:17.170621 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.170218 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.172932 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.172905 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 18:08:17.285544 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.285501 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-dshm\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.285758 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.285561 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-model-cache\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.285758 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.285634 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-home\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.285758 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.285712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.285944 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.285761 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p65bq\" (UniqueName: \"kubernetes.io/projected/af3d01c8-1fc5-4274-91f1-e60446f659fa-kube-api-access-p65bq\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.285944 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.285815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af3d01c8-1fc5-4274-91f1-e60446f659fa-tls-certs\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.386398 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.386306 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-dshm\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.386398 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.386352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-model-cache\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" 
(UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.386639 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.386437 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-home\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.386639 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.386489 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.386639 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.386521 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p65bq\" (UniqueName: \"kubernetes.io/projected/af3d01c8-1fc5-4274-91f1-e60446f659fa-kube-api-access-p65bq\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.386639 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.386551 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af3d01c8-1fc5-4274-91f1-e60446f659fa-tls-certs\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.387202 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.386930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-home\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.387202 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.387069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-model-cache\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.387202 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.387154 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.389305 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.389275 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-dshm\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.389978 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.389953 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/af3d01c8-1fc5-4274-91f1-e60446f659fa-tls-certs\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.400151 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.400129 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p65bq\" (UniqueName: \"kubernetes.io/projected/af3d01c8-1fc5-4274-91f1-e60446f659fa-kube-api-access-p65bq\") pod \"precise-prefix-cache-test-kserve-549d7668fc-tjgnx\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:17.485140 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:17.485093 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:22.899169 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:22.899133 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx"] Apr 22 18:08:22.902160 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:08:22.902128 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf3d01c8_1fc5_4274_91f1_e60446f659fa.slice/crio-378664771cab669ff508f57710dd082ca641fad3623f9656dffa8c38c2a24c3a WatchSource:0}: Error finding container 378664771cab669ff508f57710dd082ca641fad3623f9656dffa8c38c2a24c3a: Status 404 returned error can't find the container with id 378664771cab669ff508f57710dd082ca641fad3623f9656dffa8c38c2a24c3a Apr 22 18:08:23.180808 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:23.180769 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" 
event={"ID":"af3d01c8-1fc5-4274-91f1-e60446f659fa","Type":"ContainerStarted","Data":"eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459"} Apr 22 18:08:23.180808 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:23.180809 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" event={"ID":"af3d01c8-1fc5-4274-91f1-e60446f659fa","Type":"ContainerStarted","Data":"378664771cab669ff508f57710dd082ca641fad3623f9656dffa8c38c2a24c3a"} Apr 22 18:08:39.238587 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:39.238552 2568 generic.go:358] "Generic (PLEG): container finished" podID="af3d01c8-1fc5-4274-91f1-e60446f659fa" containerID="eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459" exitCode=0 Apr 22 18:08:39.239058 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:39.238639 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" event={"ID":"af3d01c8-1fc5-4274-91f1-e60446f659fa","Type":"ContainerDied","Data":"eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459"} Apr 22 18:08:40.243416 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:40.243376 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" event={"ID":"d5f7652b-611f-4a57-b942-f12fe44a9a3e","Type":"ContainerStarted","Data":"7f4cf102502530cefea065aa34c2ac36ef63fb9eed12eebe515ffc10c3b3c41b"} Apr 22 18:08:40.245434 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:40.245405 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" event={"ID":"af3d01c8-1fc5-4274-91f1-e60446f659fa","Type":"ContainerStarted","Data":"c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f"} Apr 22 18:08:40.264979 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:40.264882 2568 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podStartSLOduration=6.338982131 podStartE2EDuration="38.264865117s" podCreationTimestamp="2026-04-22 18:08:02 +0000 UTC" firstStartedPulling="2026-04-22 18:08:08.109383677 +0000 UTC m=+916.418015145" lastFinishedPulling="2026-04-22 18:08:40.035266654 +0000 UTC m=+948.343898131" observedRunningTime="2026-04-22 18:08:40.263383551 +0000 UTC m=+948.572015038" watchObservedRunningTime="2026-04-22 18:08:40.264865117 +0000 UTC m=+948.573496603" Apr 22 18:08:40.281791 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:40.281723 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" podStartSLOduration=23.281707198 podStartE2EDuration="23.281707198s" podCreationTimestamp="2026-04-22 18:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:08:40.281120166 +0000 UTC m=+948.589751652" watchObservedRunningTime="2026-04-22 18:08:40.281707198 +0000 UTC m=+948.590338684" Apr 22 18:08:43.002723 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:43.002686 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:43.002723 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:43.002747 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:08:43.004484 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:43.004451 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" 
containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:08:47.485961 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:47.485926 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:47.486423 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:47.485973 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:47.498799 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:47.498761 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:48.283094 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:48.283065 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:50.964943 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:50.964909 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx"] Apr 22 18:08:51.281544 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.281452 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" podUID="af3d01c8-1fc5-4274-91f1-e60446f659fa" containerName="main" containerID="cri-o://c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f" gracePeriod=30 Apr 22 18:08:51.528656 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.528633 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:51.603378 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.603291 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af3d01c8-1fc5-4274-91f1-e60446f659fa-tls-certs\") pod \"af3d01c8-1fc5-4274-91f1-e60446f659fa\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " Apr 22 18:08:51.603378 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.603359 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-dshm\") pod \"af3d01c8-1fc5-4274-91f1-e60446f659fa\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " Apr 22 18:08:51.603580 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.603413 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-home\") pod \"af3d01c8-1fc5-4274-91f1-e60446f659fa\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " Apr 22 18:08:51.603580 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.603439 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p65bq\" (UniqueName: \"kubernetes.io/projected/af3d01c8-1fc5-4274-91f1-e60446f659fa-kube-api-access-p65bq\") pod \"af3d01c8-1fc5-4274-91f1-e60446f659fa\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " Apr 22 18:08:51.603580 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.603481 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-model-cache\") pod \"af3d01c8-1fc5-4274-91f1-e60446f659fa\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " Apr 22 18:08:51.603580 ip-10-0-142-118 kubenswrapper[2568]: I0422 
18:08:51.603510 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-kserve-provision-location\") pod \"af3d01c8-1fc5-4274-91f1-e60446f659fa\" (UID: \"af3d01c8-1fc5-4274-91f1-e60446f659fa\") " Apr 22 18:08:51.603913 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.603852 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-home" (OuterVolumeSpecName: "home") pod "af3d01c8-1fc5-4274-91f1-e60446f659fa" (UID: "af3d01c8-1fc5-4274-91f1-e60446f659fa"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:51.604014 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.603922 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-model-cache" (OuterVolumeSpecName: "model-cache") pod "af3d01c8-1fc5-4274-91f1-e60446f659fa" (UID: "af3d01c8-1fc5-4274-91f1-e60446f659fa"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:51.605559 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.605507 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-dshm" (OuterVolumeSpecName: "dshm") pod "af3d01c8-1fc5-4274-91f1-e60446f659fa" (UID: "af3d01c8-1fc5-4274-91f1-e60446f659fa"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:51.605672 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.605602 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3d01c8-1fc5-4274-91f1-e60446f659fa-kube-api-access-p65bq" (OuterVolumeSpecName: "kube-api-access-p65bq") pod "af3d01c8-1fc5-4274-91f1-e60446f659fa" (UID: "af3d01c8-1fc5-4274-91f1-e60446f659fa"). InnerVolumeSpecName "kube-api-access-p65bq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:08:51.605824 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.605797 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af3d01c8-1fc5-4274-91f1-e60446f659fa-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "af3d01c8-1fc5-4274-91f1-e60446f659fa" (UID: "af3d01c8-1fc5-4274-91f1-e60446f659fa"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:08:51.656564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.656524 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "af3d01c8-1fc5-4274-91f1-e60446f659fa" (UID: "af3d01c8-1fc5-4274-91f1-e60446f659fa"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:08:51.704805 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.704763 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/af3d01c8-1fc5-4274-91f1-e60446f659fa-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.704805 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.704798 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.704805 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.704807 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.705017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.704816 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p65bq\" (UniqueName: \"kubernetes.io/projected/af3d01c8-1fc5-4274-91f1-e60446f659fa-kube-api-access-p65bq\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.705017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.704827 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:08:51.705017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:51.704836 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af3d01c8-1fc5-4274-91f1-e60446f659fa-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:08:52.286052 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.286015 2568 
generic.go:358] "Generic (PLEG): container finished" podID="af3d01c8-1fc5-4274-91f1-e60446f659fa" containerID="c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f" exitCode=0 Apr 22 18:08:52.286460 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.286072 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" event={"ID":"af3d01c8-1fc5-4274-91f1-e60446f659fa","Type":"ContainerDied","Data":"c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f"} Apr 22 18:08:52.286460 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.286088 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" Apr 22 18:08:52.286460 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.286106 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx" event={"ID":"af3d01c8-1fc5-4274-91f1-e60446f659fa","Type":"ContainerDied","Data":"378664771cab669ff508f57710dd082ca641fad3623f9656dffa8c38c2a24c3a"} Apr 22 18:08:52.286460 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.286127 2568 scope.go:117] "RemoveContainer" containerID="c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f" Apr 22 18:08:52.293940 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.293920 2568 scope.go:117] "RemoveContainer" containerID="eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459" Apr 22 18:08:52.305798 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.305757 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx"] Apr 22 18:08:52.307250 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.307230 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-549d7668fc-tjgnx"] Apr 22 18:08:52.372598 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.372574 2568 scope.go:117] "RemoveContainer" containerID="c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f" Apr 22 18:08:52.372917 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:08:52.372895 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f\": container with ID starting with c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f not found: ID does not exist" containerID="c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f" Apr 22 18:08:52.372986 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.372928 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f"} err="failed to get container status \"c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f\": rpc error: code = NotFound desc = could not find container \"c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f\": container with ID starting with c4540cc9f56440344fe1a085c3bfc2d0ac4ae11085364c3cae2bf8a8b8491d5f not found: ID does not exist" Apr 22 18:08:52.372986 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:52.372950 2568 scope.go:117] "RemoveContainer" containerID="eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459" Apr 22 18:08:52.373205 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:08:52.373182 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459\": container with ID starting with eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459 not found: ID does not exist" containerID="eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459" Apr 22 18:08:52.373257 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:08:52.373212 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459"} err="failed to get container status \"eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459\": rpc error: code = NotFound desc = could not find container \"eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459\": container with ID starting with eb07e2102eb66b56e08cb118c318284c46f2dd47f731567590405b046c3c9459 not found: ID does not exist" Apr 22 18:08:53.003003 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:53.002953 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:08:54.212944 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:08:54.212912 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3d01c8-1fc5-4274-91f1-e60446f659fa" path="/var/lib/kubelet/pods/af3d01c8-1fc5-4274-91f1-e60446f659fa/volumes" Apr 22 18:09:02.594218 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.594183 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"] Apr 22 18:09:02.594664 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.594646 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af3d01c8-1fc5-4274-91f1-e60446f659fa" containerName="main" Apr 22 18:09:02.594719 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.594668 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3d01c8-1fc5-4274-91f1-e60446f659fa" containerName="main" Apr 22 18:09:02.594719 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.594695 2568 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="af3d01c8-1fc5-4274-91f1-e60446f659fa" containerName="storage-initializer" Apr 22 18:09:02.594719 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.594705 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3d01c8-1fc5-4274-91f1-e60446f659fa" containerName="storage-initializer" Apr 22 18:09:02.594841 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.594794 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="af3d01c8-1fc5-4274-91f1-e60446f659fa" containerName="main" Apr 22 18:09:02.845690 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.845606 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"] Apr 22 18:09:02.845859 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.845759 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:02.850594 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.850564 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 18:09:02.900034 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.899985 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlgkq\" (UniqueName: \"kubernetes.io/projected/24840b4a-deb3-4146-8a89-7a20ef93010a-kube-api-access-nlgkq\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:02.900210 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.900056 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24840b4a-deb3-4146-8a89-7a20ef93010a-tls-certs\") pod 
\"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:02.900210 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.900094 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-home\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:02.900210 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.900130 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-model-cache\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:02.900330 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.900221 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-kserve-provision-location\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:02.900330 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:02.900255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-dshm\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 
18:09:03.001114 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001072 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-kserve-provision-location\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.001114 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-dshm\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.001378 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001304 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlgkq\" (UniqueName: \"kubernetes.io/projected/24840b4a-deb3-4146-8a89-7a20ef93010a-kube-api-access-nlgkq\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.001378 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001351 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24840b4a-deb3-4146-8a89-7a20ef93010a-tls-certs\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.001488 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001383 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-home\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.001488 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-model-cache\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.001637 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-kserve-provision-location\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.001771 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001752 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-home\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.001846 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.001784 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-model-cache\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.002342 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.002311 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:09:03.003535 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.003511 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-dshm\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.003903 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.003857 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24840b4a-deb3-4146-8a89-7a20ef93010a-tls-certs\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.009306 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.009277 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlgkq\" (UniqueName: \"kubernetes.io/projected/24840b4a-deb3-4146-8a89-7a20ef93010a-kube-api-access-nlgkq\") pod \"stop-feature-test-kserve-5d79b9f6dd-rdmnk\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.158472 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.158384 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:03.291254 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.291208 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"] Apr 22 18:09:03.293330 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:09:03.293302 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24840b4a_deb3_4146_8a89_7a20ef93010a.slice/crio-8a1883b3942f9d45c13cbff1dbbc28e55c0b5878c695070e5dd256efe689565b WatchSource:0}: Error finding container 8a1883b3942f9d45c13cbff1dbbc28e55c0b5878c695070e5dd256efe689565b: Status 404 returned error can't find the container with id 8a1883b3942f9d45c13cbff1dbbc28e55c0b5878c695070e5dd256efe689565b Apr 22 18:09:03.323280 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:03.323238 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" event={"ID":"24840b4a-deb3-4146-8a89-7a20ef93010a","Type":"ContainerStarted","Data":"8a1883b3942f9d45c13cbff1dbbc28e55c0b5878c695070e5dd256efe689565b"} Apr 22 18:09:04.327844 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:04.327807 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" event={"ID":"24840b4a-deb3-4146-8a89-7a20ef93010a","Type":"ContainerStarted","Data":"3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618"} Apr 22 18:09:08.342921 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:08.342832 2568 generic.go:358] "Generic (PLEG): container finished" podID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerID="3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618" exitCode=0 Apr 22 18:09:08.342921 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:08.342900 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" event={"ID":"24840b4a-deb3-4146-8a89-7a20ef93010a","Type":"ContainerDied","Data":"3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618"} Apr 22 18:09:09.349697 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:09.349655 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" event={"ID":"24840b4a-deb3-4146-8a89-7a20ef93010a","Type":"ContainerStarted","Data":"c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df"} Apr 22 18:09:09.369917 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:09.369862 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podStartSLOduration=7.369847016 podStartE2EDuration="7.369847016s" podCreationTimestamp="2026-04-22 18:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:09:09.368405054 +0000 UTC m=+977.677036541" watchObservedRunningTime="2026-04-22 18:09:09.369847016 +0000 UTC m=+977.678478502" Apr 22 18:09:13.002941 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:13.002849 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:09:13.159404 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:13.159356 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:13.159559 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:13.159439 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:09:13.161027 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:13.160999 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:09:23.002941 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:23.002895 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:09:23.159558 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:23.159507 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:09:33.002916 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:33.002861 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:09:33.159835 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:33.159784 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" 
output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:09:43.003394 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:43.003334 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:09:43.159680 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:43.159623 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:09:53.002853 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:53.002789 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:09:53.159958 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:09:53.159912 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:10:03.002940 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:03.002896 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" 
podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:10:03.159614 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:03.159569 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:10:13.002595 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:13.002548 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:10:13.159214 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:13.159159 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:10:23.003415 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:23.003358 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 22 18:10:23.159374 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:23.159325 2568 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:10:33.012267 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:33.012237 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:10:33.020676 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:33.020643 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" Apr 22 18:10:33.159738 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:33.159684 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:10:42.620843 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:42.620754 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm"] Apr 22 18:10:42.621800 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:42.621766 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main" containerID="cri-o://7f4cf102502530cefea065aa34c2ac36ef63fb9eed12eebe515ffc10c3b3c41b" gracePeriod=30 Apr 22 18:10:43.158896 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:43.158851 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" 
podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 22 18:10:49.600626 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.600590 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"] Apr 22 18:10:49.607712 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.607679 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.610789 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.610762 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 18:10:49.612420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.612394 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"] Apr 22 18:10:49.737473 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.737436 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-home\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.737656 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.737500 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-model-cache\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.737656 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.737553 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7410c3b-b346-44e9-8f5a-649c74970dc7-tls-certs\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.737656 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.737589 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-dshm\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.737656 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.737624 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.737828 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.737699 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpvbd\" (UniqueName: \"kubernetes.io/projected/d7410c3b-b346-44e9-8f5a-649c74970dc7-kube-api-access-bpvbd\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 
18:10:49.838683 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.838641 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-model-cache\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.838875 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.838695 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7410c3b-b346-44e9-8f5a-649c74970dc7-tls-certs\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.838875 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.838741 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-dshm\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.838875 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.838785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.839043 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.838947 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bpvbd\" (UniqueName: \"kubernetes.io/projected/d7410c3b-b346-44e9-8f5a-649c74970dc7-kube-api-access-bpvbd\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.839099 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.839041 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-home\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.839151 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.839042 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-model-cache\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.839151 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.839127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:10:49.839286 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.839254 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-home\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: 
\"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"
Apr 22 18:10:49.840919 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.840892 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-dshm\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"
Apr 22 18:10:49.841417 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.841395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7410c3b-b346-44e9-8f5a-649c74970dc7-tls-certs\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"
Apr 22 18:10:49.848355 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.848327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpvbd\" (UniqueName: \"kubernetes.io/projected/d7410c3b-b346-44e9-8f5a-649c74970dc7-kube-api-access-bpvbd\") pod \"custom-route-timeout-test-kserve-6769776b64-gdzkp\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"
Apr 22 18:10:49.920402 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:49.920312 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"
Apr 22 18:10:50.047336 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:50.047308 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"]
Apr 22 18:10:50.049483 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:10:50.049449 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7410c3b_b346_44e9_8f5a_649c74970dc7.slice/crio-c36b9497b31e05907646ebd042f3b08cdf017f981ea0c4fa1fad81282e3f6bd2 WatchSource:0}: Error finding container c36b9497b31e05907646ebd042f3b08cdf017f981ea0c4fa1fad81282e3f6bd2: Status 404 returned error can't find the container with id c36b9497b31e05907646ebd042f3b08cdf017f981ea0c4fa1fad81282e3f6bd2
Apr 22 18:10:50.704086 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:50.704049 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" event={"ID":"d7410c3b-b346-44e9-8f5a-649c74970dc7","Type":"ContainerStarted","Data":"be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f"}
Apr 22 18:10:50.704086 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:50.704085 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" event={"ID":"d7410c3b-b346-44e9-8f5a-649c74970dc7","Type":"ContainerStarted","Data":"c36b9497b31e05907646ebd042f3b08cdf017f981ea0c4fa1fad81282e3f6bd2"}
Apr 22 18:10:53.159413 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:53.159371 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused"
Apr 22 18:10:54.720154 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:54.720121 2568 generic.go:358] "Generic (PLEG): container finished" podID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerID="be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f" exitCode=0
Apr 22 18:10:54.720521 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:54.720173 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" event={"ID":"d7410c3b-b346-44e9-8f5a-649c74970dc7","Type":"ContainerDied","Data":"be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f"}
Apr 22 18:10:55.727136 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:55.727096 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" event={"ID":"d7410c3b-b346-44e9-8f5a-649c74970dc7","Type":"ContainerStarted","Data":"3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2"}
Apr 22 18:10:55.748257 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:55.748200 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podStartSLOduration=6.748180939 podStartE2EDuration="6.748180939s" podCreationTimestamp="2026-04-22 18:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:10:55.74797628 +0000 UTC m=+1084.056607773" watchObservedRunningTime="2026-04-22 18:10:55.748180939 +0000 UTC m=+1084.056812437"
Apr 22 18:10:59.921295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:59.921259 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"
Apr 22 18:10:59.921295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:59.921303 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"
Apr 22 18:10:59.922891 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:10:59.922860 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 22 18:11:03.169130 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:03.169093 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"
Apr 22 18:11:03.177053 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:03.177025 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"
Apr 22 18:11:06.899326 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:06.899292 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"]
Apr 22 18:11:06.899842 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:06.899634 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" containerID="cri-o://c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df" gracePeriod=30
Apr 22 18:11:09.921556 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:09.921504 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 22 18:11:12.788075 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.787878 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm_d5f7652b-611f-4a57-b942-f12fe44a9a3e/main/0.log"
Apr 22 18:11:12.788386 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.788300 2568 generic.go:358] "Generic (PLEG): container finished" podID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerID="7f4cf102502530cefea065aa34c2ac36ef63fb9eed12eebe515ffc10c3b3c41b" exitCode=137
Apr 22 18:11:12.788386 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.788370 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" event={"ID":"d5f7652b-611f-4a57-b942-f12fe44a9a3e","Type":"ContainerDied","Data":"7f4cf102502530cefea065aa34c2ac36ef63fb9eed12eebe515ffc10c3b3c41b"}
Apr 22 18:11:12.868476 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.868446 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm_d5f7652b-611f-4a57-b942-f12fe44a9a3e/main/0.log"
Apr 22 18:11:12.868868 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.868853 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm"
Apr 22 18:11:12.946044 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.946018 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxnsl\" (UniqueName: \"kubernetes.io/projected/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kube-api-access-pxnsl\") pod \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") "
Apr 22 18:11:12.946218 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.946072 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kserve-provision-location\") pod \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") "
Apr 22 18:11:12.946218 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.946134 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f7652b-611f-4a57-b942-f12fe44a9a3e-tls-certs\") pod \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") "
Apr 22 18:11:12.946218 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.946162 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-dshm\") pod \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") "
Apr 22 18:11:12.946218 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.946193 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-model-cache\") pod \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") "
Apr 22 18:11:12.946431 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.946223 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-home\") pod \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\" (UID: \"d5f7652b-611f-4a57-b942-f12fe44a9a3e\") "
Apr 22 18:11:12.946597 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.946527 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-model-cache" (OuterVolumeSpecName: "model-cache") pod "d5f7652b-611f-4a57-b942-f12fe44a9a3e" (UID: "d5f7652b-611f-4a57-b942-f12fe44a9a3e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:12.946804 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.946778 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-home" (OuterVolumeSpecName: "home") pod "d5f7652b-611f-4a57-b942-f12fe44a9a3e" (UID: "d5f7652b-611f-4a57-b942-f12fe44a9a3e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:12.948229 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.948202 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kube-api-access-pxnsl" (OuterVolumeSpecName: "kube-api-access-pxnsl") pod "d5f7652b-611f-4a57-b942-f12fe44a9a3e" (UID: "d5f7652b-611f-4a57-b942-f12fe44a9a3e"). InnerVolumeSpecName "kube-api-access-pxnsl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:11:12.948319 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.948230 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-dshm" (OuterVolumeSpecName: "dshm") pod "d5f7652b-611f-4a57-b942-f12fe44a9a3e" (UID: "d5f7652b-611f-4a57-b942-f12fe44a9a3e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:12.948472 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:12.948457 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f7652b-611f-4a57-b942-f12fe44a9a3e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d5f7652b-611f-4a57-b942-f12fe44a9a3e" (UID: "d5f7652b-611f-4a57-b942-f12fe44a9a3e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:11:13.000703 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.000656 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d5f7652b-611f-4a57-b942-f12fe44a9a3e" (UID: "d5f7652b-611f-4a57-b942-f12fe44a9a3e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:13.047158 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.047124 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:13.047158 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.047152 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d5f7652b-611f-4a57-b942-f12fe44a9a3e-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:13.047158 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.047162 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:13.047379 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.047170 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:13.047379 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.047178 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d5f7652b-611f-4a57-b942-f12fe44a9a3e-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:13.047379 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.047186 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxnsl\" (UniqueName: \"kubernetes.io/projected/d5f7652b-611f-4a57-b942-f12fe44a9a3e-kube-api-access-pxnsl\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:13.794390 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.794358 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm_d5f7652b-611f-4a57-b942-f12fe44a9a3e/main/0.log"
Apr 22 18:11:13.794863 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.794830 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm" event={"ID":"d5f7652b-611f-4a57-b942-f12fe44a9a3e","Type":"ContainerDied","Data":"aea46906ae2de214e0677a893d770c104822a65d9c66667c2b3974524c705d35"}
Apr 22 18:11:13.794915 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.794869 2568 scope.go:117] "RemoveContainer" containerID="7f4cf102502530cefea065aa34c2ac36ef63fb9eed12eebe515ffc10c3b3c41b"
Apr 22 18:11:13.794915 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.794903 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm"
Apr 22 18:11:13.816389 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.816367 2568 scope.go:117] "RemoveContainer" containerID="1304fd8b893270d169a2e51ba7fa84104f8796e1b8a4ef1d764e5e31ba79ecd0"
Apr 22 18:11:13.819664 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.819640 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm"]
Apr 22 18:11:13.824623 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:13.824596 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-69859ff4946kjqm"]
Apr 22 18:11:14.217814 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:14.217781 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" path="/var/lib/kubelet/pods/d5f7652b-611f-4a57-b942-f12fe44a9a3e/volumes"
Apr 22 18:11:17.400429 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.400398 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"]
Apr 22 18:11:17.400929 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.400908 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main"
Apr 22 18:11:17.400929 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.400930 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main"
Apr 22 18:11:17.401077 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.400948 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="storage-initializer"
Apr 22 18:11:17.401077 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.400958 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="storage-initializer"
Apr 22 18:11:17.401077 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.401042 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5f7652b-611f-4a57-b942-f12fe44a9a3e" containerName="main"
Apr 22 18:11:17.404450 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.404429 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.415678 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.415655 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"]
Apr 22 18:11:17.484096 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.484055 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8tcs\" (UniqueName: \"kubernetes.io/projected/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kube-api-access-p8tcs\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.484296 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.484112 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-model-cache\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.484296 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.484140 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-home\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.484296 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.484258 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/169f435d-ab79-4ff4-b63f-8dbf40a8e709-tls-certs\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.484450 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.484304 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-dshm\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.484450 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.484336 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kserve-provision-location\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.585643 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.585600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8tcs\" (UniqueName: \"kubernetes.io/projected/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kube-api-access-p8tcs\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.585878 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.585679 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-model-cache\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.585878 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.585723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-home\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.585878 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.585807 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/169f435d-ab79-4ff4-b63f-8dbf40a8e709-tls-certs\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.586045 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.585940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-dshm\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.586045 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.586008 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kserve-provision-location\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.586155 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.586117 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-home\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.586155 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.586146 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-model-cache\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.586306 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.586287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kserve-provision-location\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.588112 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.588087 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-dshm\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.588394 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.588372 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/169f435d-ab79-4ff4-b63f-8dbf40a8e709-tls-certs\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.592804 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.592783 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8tcs\" (UniqueName: \"kubernetes.io/projected/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kube-api-access-p8tcs\") pod \"stop-feature-test-kserve-5d79b9f6dd-ft2r6\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.717012 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.716977 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:17.843105 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:17.843080 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"]
Apr 22 18:11:17.845513 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:11:17.845482 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod169f435d_ab79_4ff4_b63f_8dbf40a8e709.slice/crio-b6fcddb31bb18bde0e2f6e97ede100a5f7f9b9100c1dc6466ea7f303a1891eff WatchSource:0}: Error finding container b6fcddb31bb18bde0e2f6e97ede100a5f7f9b9100c1dc6466ea7f303a1891eff: Status 404 returned error can't find the container with id b6fcddb31bb18bde0e2f6e97ede100a5f7f9b9100c1dc6466ea7f303a1891eff
Apr 22 18:11:18.814008 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:18.813963 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" event={"ID":"169f435d-ab79-4ff4-b63f-8dbf40a8e709","Type":"ContainerStarted","Data":"12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b"}
Apr 22 18:11:18.814008 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:18.814014 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" event={"ID":"169f435d-ab79-4ff4-b63f-8dbf40a8e709","Type":"ContainerStarted","Data":"b6fcddb31bb18bde0e2f6e97ede100a5f7f9b9100c1dc6466ea7f303a1891eff"}
Apr 22 18:11:19.921740 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:19.921690 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 22 18:11:22.828560 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:22.828527 2568 generic.go:358] "Generic (PLEG): container finished" podID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerID="12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b" exitCode=0
Apr 22 18:11:22.828955 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:22.828597 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" event={"ID":"169f435d-ab79-4ff4-b63f-8dbf40a8e709","Type":"ContainerDied","Data":"12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b"}
Apr 22 18:11:23.833352 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:23.833312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" event={"ID":"169f435d-ab79-4ff4-b63f-8dbf40a8e709","Type":"ContainerStarted","Data":"e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad"}
Apr 22 18:11:23.853686 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:23.853637 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podStartSLOduration=6.853623357 podStartE2EDuration="6.853623357s" podCreationTimestamp="2026-04-22 18:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:11:23.852204071 +0000 UTC m=+1112.160835559" watchObservedRunningTime="2026-04-22 18:11:23.853623357 +0000 UTC m=+1112.162254843"
Apr 22 18:11:27.717750 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:27.717691 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:27.717750 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:27.717755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"
Apr 22 18:11:27.719279 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:27.719246 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 22 18:11:29.921835 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:29.921782 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused"
Apr 22 18:11:37.178681 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.178648 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5d79b9f6dd-rdmnk_24840b4a-deb3-4146-8a89-7a20ef93010a/main/0.log"
Apr 22 18:11:37.179153 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.179133 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"
Apr 22 18:11:37.268928 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.268887 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-model-cache\") pod \"24840b4a-deb3-4146-8a89-7a20ef93010a\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") "
Apr 22 18:11:37.268928 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.268933 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-home\") pod \"24840b4a-deb3-4146-8a89-7a20ef93010a\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") "
Apr 22 18:11:37.269186 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.268955 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24840b4a-deb3-4146-8a89-7a20ef93010a-tls-certs\") pod \"24840b4a-deb3-4146-8a89-7a20ef93010a\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") "
Apr 22 18:11:37.269186 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.268983 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-dshm\") pod \"24840b4a-deb3-4146-8a89-7a20ef93010a\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") "
Apr 22 18:11:37.269186 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.269022 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlgkq\" (UniqueName: \"kubernetes.io/projected/24840b4a-deb3-4146-8a89-7a20ef93010a-kube-api-access-nlgkq\") pod \"24840b4a-deb3-4146-8a89-7a20ef93010a\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") "
Apr 22 18:11:37.269186 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.269054 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-kserve-provision-location\") pod \"24840b4a-deb3-4146-8a89-7a20ef93010a\" (UID: \"24840b4a-deb3-4146-8a89-7a20ef93010a\") "
Apr 22 18:11:37.269394 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.269246 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-model-cache" (OuterVolumeSpecName: "model-cache") pod "24840b4a-deb3-4146-8a89-7a20ef93010a" (UID: "24840b4a-deb3-4146-8a89-7a20ef93010a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:37.269394 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.269385 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:37.269521 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.269496 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-home" (OuterVolumeSpecName: "home") pod "24840b4a-deb3-4146-8a89-7a20ef93010a" (UID: "24840b4a-deb3-4146-8a89-7a20ef93010a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:37.271397 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.271367 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24840b4a-deb3-4146-8a89-7a20ef93010a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "24840b4a-deb3-4146-8a89-7a20ef93010a" (UID: "24840b4a-deb3-4146-8a89-7a20ef93010a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:11:37.271682 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.271653 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-dshm" (OuterVolumeSpecName: "dshm") pod "24840b4a-deb3-4146-8a89-7a20ef93010a" (UID: "24840b4a-deb3-4146-8a89-7a20ef93010a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:37.271816 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.271784 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24840b4a-deb3-4146-8a89-7a20ef93010a-kube-api-access-nlgkq" (OuterVolumeSpecName: "kube-api-access-nlgkq") pod "24840b4a-deb3-4146-8a89-7a20ef93010a" (UID: "24840b4a-deb3-4146-8a89-7a20ef93010a"). InnerVolumeSpecName "kube-api-access-nlgkq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:11:37.338704 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.338649 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "24840b4a-deb3-4146-8a89-7a20ef93010a" (UID: "24840b4a-deb3-4146-8a89-7a20ef93010a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:11:37.371174 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.371134 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:37.371174 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.371163 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24840b4a-deb3-4146-8a89-7a20ef93010a-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:37.371174 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.371176 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:37.371407 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.371190 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlgkq\" (UniqueName: \"kubernetes.io/projected/24840b4a-deb3-4146-8a89-7a20ef93010a-kube-api-access-nlgkq\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:37.371407 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.371213 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24840b4a-deb3-4146-8a89-7a20ef93010a-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:11:37.717949 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.717890 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused"
Apr 22 18:11:37.892069 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.892028 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5d79b9f6dd-rdmnk_24840b4a-deb3-4146-8a89-7a20ef93010a/main/0.log" Apr 22 18:11:37.892422 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.892397 2568 generic.go:358] "Generic (PLEG): container finished" podID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerID="c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df" exitCode=137 Apr 22 18:11:37.892576 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.892488 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" Apr 22 18:11:37.892576 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.892484 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" event={"ID":"24840b4a-deb3-4146-8a89-7a20ef93010a","Type":"ContainerDied","Data":"c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df"} Apr 22 18:11:37.892709 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.892602 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk" event={"ID":"24840b4a-deb3-4146-8a89-7a20ef93010a","Type":"ContainerDied","Data":"8a1883b3942f9d45c13cbff1dbbc28e55c0b5878c695070e5dd256efe689565b"} Apr 22 18:11:37.892709 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.892619 2568 scope.go:117] "RemoveContainer" containerID="c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df" Apr 22 18:11:37.911623 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.911383 2568 scope.go:117] "RemoveContainer" containerID="3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618" Apr 22 18:11:37.919352 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.919318 2568 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"] Apr 22 18:11:37.922632 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.922606 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-rdmnk"] Apr 22 18:11:37.923360 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.923339 2568 scope.go:117] "RemoveContainer" containerID="c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df" Apr 22 18:11:37.923737 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:11:37.923694 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df\": container with ID starting with c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df not found: ID does not exist" containerID="c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df" Apr 22 18:11:37.923834 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.923749 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df"} err="failed to get container status \"c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df\": rpc error: code = NotFound desc = could not find container \"c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df\": container with ID starting with c1bd3d566310c347aeeb500450c19680ed26ef00382e8228e253a1d1584669df not found: ID does not exist" Apr 22 18:11:37.923834 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.923774 2568 scope.go:117] "RemoveContainer" containerID="3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618" Apr 22 18:11:37.924076 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:11:37.924060 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618\": container with ID starting with 3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618 not found: ID does not exist" containerID="3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618" Apr 22 18:11:37.924134 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:37.924079 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618"} err="failed to get container status \"3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618\": rpc error: code = NotFound desc = could not find container \"3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618\": container with ID starting with 3037b3868a0dc8689b6b69fc9a5d9b7244bb9ea9615dbfb2cd6ef2367b50e618 not found: ID does not exist" Apr 22 18:11:38.213019 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:38.212983 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" path="/var/lib/kubelet/pods/24840b4a-deb3-4146-8a89-7a20ef93010a/volumes" Apr 22 18:11:39.921043 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:39.920989 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 22 18:11:47.717846 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:47.717792 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 18:11:49.921766 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:11:49.921712 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 22 18:11:57.718313 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:57.718258 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 18:11:59.921781 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:11:59.921737 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 22 18:12:07.717658 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:07.717604 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 18:12:09.921278 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:09.921235 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 22 18:12:17.718257 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:17.718161 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 18:12:19.921058 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:19.921005 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 22 18:12:27.717638 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:27.717587 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 18:12:29.921229 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:29.921184 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 22 18:12:37.718415 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:37.718372 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 
18:12:39.930639 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:39.930580 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 22 18:12:47.718198 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:47.718152 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 18:12:49.931449 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:49.931419 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:12:49.939018 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:49.938990 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:12:52.185744 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:52.185696 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log" Apr 22 18:12:52.188423 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:52.188397 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log" Apr 22 18:12:57.717870 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:57.717804 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" 
podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 18:12:59.815459 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:59.815423 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"] Apr 22 18:12:59.815937 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:12:59.815805 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" containerID="cri-o://3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2" gracePeriod=30 Apr 22 18:13:07.718336 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:07.718286 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.33:8000/health\": dial tcp 10.132.0.33:8000: connect: connection refused" Apr 22 18:13:17.727704 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:17.727668 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" Apr 22 18:13:17.735597 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:17.735575 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" Apr 22 18:13:20.308521 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.308477 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr"] Apr 22 18:13:20.309229 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.309206 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" Apr 22 18:13:20.309229 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.309230 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" Apr 22 18:13:20.309395 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.309244 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="storage-initializer" Apr 22 18:13:20.309395 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.309253 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="storage-initializer" Apr 22 18:13:20.309493 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.309401 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="24840b4a-deb3-4146-8a89-7a20ef93010a" containerName="main" Apr 22 18:13:20.312522 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.312495 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.315742 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.315705 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 18:13:20.322175 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.322149 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr"] Apr 22 18:13:20.441807 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.441768 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-dshm\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.441984 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.441824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-model-cache\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.441984 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.441869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4gcq\" (UniqueName: \"kubernetes.io/projected/c20bef54-af40-4a86-99b1-e786c054d103-kube-api-access-z4gcq\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.441984 ip-10-0-142-118 kubenswrapper[2568]: I0422 
18:13:20.441902 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c20bef54-af40-4a86-99b1-e786c054d103-tls-certs\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.441984 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.441920 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-kserve-provision-location\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.441984 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.441939 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-home\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543174 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-dshm\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543388 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543187 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-model-cache\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543388 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4gcq\" (UniqueName: \"kubernetes.io/projected/c20bef54-af40-4a86-99b1-e786c054d103-kube-api-access-z4gcq\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543388 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543243 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c20bef54-af40-4a86-99b1-e786c054d103-tls-certs\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543388 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-kserve-provision-location\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543388 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543281 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-home\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543673 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-home\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543752 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543713 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-model-cache\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.543809 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.543724 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-kserve-provision-location\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.545528 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.545507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-dshm\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.545717 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.545700 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c20bef54-af40-4a86-99b1-e786c054d103-tls-certs\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.550868 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.550847 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4gcq\" (UniqueName: \"kubernetes.io/projected/c20bef54-af40-4a86-99b1-e786c054d103-kube-api-access-z4gcq\") pod \"router-with-refs-test-kserve-f99c8d868-gtdgr\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.625104 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.625019 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:20.751297 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.751275 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr"] Apr 22 18:13:20.753292 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:13:20.753258 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc20bef54_af40_4a86_99b1_e786c054d103.slice/crio-2dcdf2765147d39fa399796ccab6f73bde6d960169136839676a612d7891431a WatchSource:0}: Error finding container 2dcdf2765147d39fa399796ccab6f73bde6d960169136839676a612d7891431a: Status 404 returned error can't find the container with id 2dcdf2765147d39fa399796ccab6f73bde6d960169136839676a612d7891431a Apr 22 18:13:20.755071 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:20.755055 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:13:21.247834 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:21.247796 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" event={"ID":"c20bef54-af40-4a86-99b1-e786c054d103","Type":"ContainerStarted","Data":"41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e"} Apr 22 18:13:21.247834 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:21.247837 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" event={"ID":"c20bef54-af40-4a86-99b1-e786c054d103","Type":"ContainerStarted","Data":"2dcdf2765147d39fa399796ccab6f73bde6d960169136839676a612d7891431a"} Apr 22 18:13:26.265498 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:26.265463 2568 generic.go:358] "Generic (PLEG): container finished" podID="c20bef54-af40-4a86-99b1-e786c054d103" containerID="41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e" exitCode=0 Apr 22 18:13:26.265991 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:26.265542 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" event={"ID":"c20bef54-af40-4a86-99b1-e786c054d103","Type":"ContainerDied","Data":"41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e"} Apr 22 18:13:27.270038 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:27.269993 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" event={"ID":"c20bef54-af40-4a86-99b1-e786c054d103","Type":"ContainerStarted","Data":"de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59"} Apr 22 18:13:27.290953 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:27.290895 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podStartSLOduration=7.290876763 podStartE2EDuration="7.290876763s" podCreationTimestamp="2026-04-22 18:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:13:27.288997347 +0000 UTC m=+1235.597628835" watchObservedRunningTime="2026-04-22 18:13:27.290876763 +0000 UTC m=+1235.599508249" Apr 22 18:13:29.932411 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:29.932366 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" probeResult="failure" output="Get \"https://10.132.0.32:8000/health\": dial tcp 10.132.0.32:8000: connect: connection refused" Apr 22 18:13:30.061902 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.061872 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6769776b64-gdzkp_d7410c3b-b346-44e9-8f5a-649c74970dc7/main/0.log" Apr 22 18:13:30.062251 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.062234 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:13:30.132746 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.132654 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-model-cache\") pod \"d7410c3b-b346-44e9-8f5a-649c74970dc7\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " Apr 22 18:13:30.132746 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.132714 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-kserve-provision-location\") pod \"d7410c3b-b346-44e9-8f5a-649c74970dc7\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " Apr 22 18:13:30.132987 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.132808 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpvbd\" (UniqueName: \"kubernetes.io/projected/d7410c3b-b346-44e9-8f5a-649c74970dc7-kube-api-access-bpvbd\") pod \"d7410c3b-b346-44e9-8f5a-649c74970dc7\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " Apr 22 18:13:30.132987 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.132853 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-home\") pod \"d7410c3b-b346-44e9-8f5a-649c74970dc7\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " Apr 22 18:13:30.132987 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.132888 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7410c3b-b346-44e9-8f5a-649c74970dc7-tls-certs\") pod \"d7410c3b-b346-44e9-8f5a-649c74970dc7\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " Apr 22 18:13:30.132987 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.132925 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-dshm\") pod \"d7410c3b-b346-44e9-8f5a-649c74970dc7\" (UID: \"d7410c3b-b346-44e9-8f5a-649c74970dc7\") " Apr 22 18:13:30.132987 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.132928 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-model-cache" (OuterVolumeSpecName: "model-cache") pod "d7410c3b-b346-44e9-8f5a-649c74970dc7" (UID: "d7410c3b-b346-44e9-8f5a-649c74970dc7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:30.133304 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.133281 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:13:30.133369 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.133277 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-home" (OuterVolumeSpecName: "home") pod "d7410c3b-b346-44e9-8f5a-649c74970dc7" (UID: "d7410c3b-b346-44e9-8f5a-649c74970dc7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:30.134999 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.134968 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-dshm" (OuterVolumeSpecName: "dshm") pod "d7410c3b-b346-44e9-8f5a-649c74970dc7" (UID: "d7410c3b-b346-44e9-8f5a-649c74970dc7"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:30.135123 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.135022 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7410c3b-b346-44e9-8f5a-649c74970dc7-kube-api-access-bpvbd" (OuterVolumeSpecName: "kube-api-access-bpvbd") pod "d7410c3b-b346-44e9-8f5a-649c74970dc7" (UID: "d7410c3b-b346-44e9-8f5a-649c74970dc7"). InnerVolumeSpecName "kube-api-access-bpvbd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:13:30.135335 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.135310 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7410c3b-b346-44e9-8f5a-649c74970dc7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d7410c3b-b346-44e9-8f5a-649c74970dc7" (UID: "d7410c3b-b346-44e9-8f5a-649c74970dc7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:13:30.187370 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.187324 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d7410c3b-b346-44e9-8f5a-649c74970dc7" (UID: "d7410c3b-b346-44e9-8f5a-649c74970dc7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:13:30.234387 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.234359 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:13:30.234387 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.234386 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpvbd\" (UniqueName: \"kubernetes.io/projected/d7410c3b-b346-44e9-8f5a-649c74970dc7-kube-api-access-bpvbd\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:13:30.234564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.234396 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:13:30.234564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.234405 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d7410c3b-b346-44e9-8f5a-649c74970dc7-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:13:30.234564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.234416 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d7410c3b-b346-44e9-8f5a-649c74970dc7-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:13:30.285884 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.285856 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-6769776b64-gdzkp_d7410c3b-b346-44e9-8f5a-649c74970dc7/main/0.log" Apr 22 18:13:30.286279 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.286253 2568 generic.go:358] "Generic (PLEG): container 
finished" podID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerID="3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2" exitCode=137 Apr 22 18:13:30.286343 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.286329 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" Apr 22 18:13:30.286421 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.286327 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" event={"ID":"d7410c3b-b346-44e9-8f5a-649c74970dc7","Type":"ContainerDied","Data":"3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2"} Apr 22 18:13:30.286457 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.286439 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp" event={"ID":"d7410c3b-b346-44e9-8f5a-649c74970dc7","Type":"ContainerDied","Data":"c36b9497b31e05907646ebd042f3b08cdf017f981ea0c4fa1fad81282e3f6bd2"} Apr 22 18:13:30.286494 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.286459 2568 scope.go:117] "RemoveContainer" containerID="3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2" Apr 22 18:13:30.305113 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.305082 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"] Apr 22 18:13:30.308853 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.308830 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-6769776b64-gdzkp"] Apr 22 18:13:30.313357 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.313341 2568 scope.go:117] "RemoveContainer" containerID="be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f" Apr 22 18:13:30.373974 ip-10-0-142-118 kubenswrapper[2568]: I0422 
18:13:30.373952 2568 scope.go:117] "RemoveContainer" containerID="3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2" Apr 22 18:13:30.374312 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:13:30.374289 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2\": container with ID starting with 3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2 not found: ID does not exist" containerID="3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2" Apr 22 18:13:30.374368 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.374326 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2"} err="failed to get container status \"3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2\": rpc error: code = NotFound desc = could not find container \"3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2\": container with ID starting with 3ece5c73e73f431a9dcdbbcbdfb05a7373afe4ebbbf2d6d5346dcd150f10c6a2 not found: ID does not exist" Apr 22 18:13:30.374368 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.374349 2568 scope.go:117] "RemoveContainer" containerID="be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f" Apr 22 18:13:30.374647 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:13:30.374620 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f\": container with ID starting with be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f not found: ID does not exist" containerID="be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f" Apr 22 18:13:30.374706 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.374648 2568 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f"} err="failed to get container status \"be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f\": rpc error: code = NotFound desc = could not find container \"be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f\": container with ID starting with be7a4852d8def5c1b634a793a834eb6ebddd5849c1471207db8eeaf631c55b6f not found: ID does not exist" Apr 22 18:13:30.625307 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.625269 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:30.625307 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.625317 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:13:30.626895 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.626865 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:13:30.801993 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.801948 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"] Apr 22 18:13:30.802347 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:30.802320 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" containerID="cri-o://e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad" gracePeriod=30 Apr 22 
18:13:32.213399 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:32.213361 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" path="/var/lib/kubelet/pods/d7410c3b-b346-44e9-8f5a-649c74970dc7/volumes" Apr 22 18:13:40.625426 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:40.625372 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:13:50.625703 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:13:50.625612 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:14:00.625964 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:00.625917 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:14:01.077099 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.077078 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5d79b9f6dd-ft2r6_169f435d-ab79-4ff4-b63f-8dbf40a8e709/main/0.log" Apr 22 18:14:01.077453 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.077437 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" Apr 22 18:14:01.084778 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.084759 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-dshm\") pod \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " Apr 22 18:14:01.084854 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.084791 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/169f435d-ab79-4ff4-b63f-8dbf40a8e709-tls-certs\") pod \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " Apr 22 18:14:01.084854 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.084822 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-home\") pod \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " Apr 22 18:14:01.084927 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.084902 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8tcs\" (UniqueName: \"kubernetes.io/projected/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kube-api-access-p8tcs\") pod \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " Apr 22 18:14:01.084967 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.084931 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-model-cache\") pod \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " Apr 22 18:14:01.085061 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.084993 2568 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kserve-provision-location\") pod \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\" (UID: \"169f435d-ab79-4ff4-b63f-8dbf40a8e709\") " Apr 22 18:14:01.085199 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.085171 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-model-cache" (OuterVolumeSpecName: "model-cache") pod "169f435d-ab79-4ff4-b63f-8dbf40a8e709" (UID: "169f435d-ab79-4ff4-b63f-8dbf40a8e709"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:14:01.085302 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.085226 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-home" (OuterVolumeSpecName: "home") pod "169f435d-ab79-4ff4-b63f-8dbf40a8e709" (UID: "169f435d-ab79-4ff4-b63f-8dbf40a8e709"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:14:01.086999 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.086976 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/169f435d-ab79-4ff4-b63f-8dbf40a8e709-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "169f435d-ab79-4ff4-b63f-8dbf40a8e709" (UID: "169f435d-ab79-4ff4-b63f-8dbf40a8e709"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:14:01.087298 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.087273 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-dshm" (OuterVolumeSpecName: "dshm") pod "169f435d-ab79-4ff4-b63f-8dbf40a8e709" (UID: "169f435d-ab79-4ff4-b63f-8dbf40a8e709"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:14:01.087403 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.087274 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kube-api-access-p8tcs" (OuterVolumeSpecName: "kube-api-access-p8tcs") pod "169f435d-ab79-4ff4-b63f-8dbf40a8e709" (UID: "169f435d-ab79-4ff4-b63f-8dbf40a8e709"). InnerVolumeSpecName "kube-api-access-p8tcs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:14:01.153943 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.153874 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "169f435d-ab79-4ff4-b63f-8dbf40a8e709" (UID: "169f435d-ab79-4ff4-b63f-8dbf40a8e709"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:14:01.185966 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.185878 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p8tcs\" (UniqueName: \"kubernetes.io/projected/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kube-api-access-p8tcs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:14:01.185966 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.185909 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:14:01.185966 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.185921 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:14:01.185966 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.185930 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:14:01.185966 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.185939 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/169f435d-ab79-4ff4-b63f-8dbf40a8e709-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:14:01.185966 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.185949 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/169f435d-ab79-4ff4-b63f-8dbf40a8e709-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:14:01.399861 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.399828 2568 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-5d79b9f6dd-ft2r6_169f435d-ab79-4ff4-b63f-8dbf40a8e709/main/0.log" Apr 22 18:14:01.400189 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.400165 2568 generic.go:358] "Generic (PLEG): container finished" podID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerID="e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad" exitCode=137 Apr 22 18:14:01.400272 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.400230 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" event={"ID":"169f435d-ab79-4ff4-b63f-8dbf40a8e709","Type":"ContainerDied","Data":"e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad"} Apr 22 18:14:01.400272 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.400263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" event={"ID":"169f435d-ab79-4ff4-b63f-8dbf40a8e709","Type":"ContainerDied","Data":"b6fcddb31bb18bde0e2f6e97ede100a5f7f9b9100c1dc6466ea7f303a1891eff"} Apr 22 18:14:01.400388 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.400278 2568 scope.go:117] "RemoveContainer" containerID="e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad" Apr 22 18:14:01.400388 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.400237 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6" Apr 22 18:14:01.420564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.420543 2568 scope.go:117] "RemoveContainer" containerID="12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b" Apr 22 18:14:01.425555 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.425528 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"] Apr 22 18:14:01.428332 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.428310 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-5d79b9f6dd-ft2r6"] Apr 22 18:14:01.430837 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.430820 2568 scope.go:117] "RemoveContainer" containerID="e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad" Apr 22 18:14:01.431108 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:14:01.431088 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad\": container with ID starting with e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad not found: ID does not exist" containerID="e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad" Apr 22 18:14:01.431155 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.431116 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad"} err="failed to get container status \"e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad\": rpc error: code = NotFound desc = could not find container \"e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad\": container with ID starting with e925a8d73e42b697b8edc35c227f35692c1e24057db4989e45c39d2555f46cad not found: ID does not exist" Apr 
22 18:14:01.431155 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.431135 2568 scope.go:117] "RemoveContainer" containerID="12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b" Apr 22 18:14:01.431369 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:14:01.431353 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b\": container with ID starting with 12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b not found: ID does not exist" containerID="12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b" Apr 22 18:14:01.431416 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:01.431373 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b"} err="failed to get container status \"12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b\": rpc error: code = NotFound desc = could not find container \"12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b\": container with ID starting with 12388f3e98bcd4d474a6316e9eecf42dbc8cea313afb24c0dc8b150a4dd72a1b not found: ID does not exist" Apr 22 18:14:02.214132 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:02.214095 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" path="/var/lib/kubelet/pods/169f435d-ab79-4ff4-b63f-8dbf40a8e709/volumes" Apr 22 18:14:10.625509 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:10.625462 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:14:20.626289 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:14:20.626241 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:14:30.625505 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:30.625446 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:14:40.626420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:40.626369 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:14:44.006069 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006032 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k"] Apr 22 18:14:44.006561 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006532 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" Apr 22 18:14:44.006561 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006562 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" Apr 22 18:14:44.006740 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006580 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="storage-initializer" Apr 22 18:14:44.006740 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006589 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="storage-initializer" Apr 22 18:14:44.006740 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006598 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="storage-initializer" Apr 22 18:14:44.006740 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006606 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="storage-initializer" Apr 22 18:14:44.006740 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006626 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" Apr 22 18:14:44.006740 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006635 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" Apr 22 18:14:44.006959 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006751 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7410c3b-b346-44e9-8f5a-649c74970dc7" containerName="main" Apr 22 18:14:44.006959 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.006765 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="169f435d-ab79-4ff4-b63f-8dbf40a8e709" containerName="main" Apr 22 18:14:44.015935 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.015911 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.019007 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.018986 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 22 18:14:44.027098 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.027078 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k"] Apr 22 18:14:44.151265 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.151228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh5w2\" (UniqueName: \"kubernetes.io/projected/0811462c-f53b-4753-b679-edf6a901258b-kube-api-access-xh5w2\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.151265 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.151269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0811462c-f53b-4753-b679-edf6a901258b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.151490 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.151300 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" 
(UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.151490 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.151346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.151490 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.151385 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.151490 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.151403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.252707 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.252675 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xh5w2\" (UniqueName: \"kubernetes.io/projected/0811462c-f53b-4753-b679-edf6a901258b-kube-api-access-xh5w2\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: 
\"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.252925 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.252716 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0811462c-f53b-4753-b679-edf6a901258b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.252925 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.252758 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.252925 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.252784 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.252925 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.252832 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.252925 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.252853 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.253242 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.253214 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.253400 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.253374 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.253529 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.253507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.255100 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.255079 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.255462 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.255441 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0811462c-f53b-4753-b679-edf6a901258b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.260827 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.260772 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh5w2\" (UniqueName: \"kubernetes.io/projected/0811462c-f53b-4753-b679-edf6a901258b-kube-api-access-xh5w2\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.328831 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.328788 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:44.460981 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.460896 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k"] Apr 22 18:14:44.464109 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:14:44.464080 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0811462c_f53b_4753_b679_edf6a901258b.slice/crio-c70f7e79eae2f8f39fe16882075a3d50298d8791e764d7547b7cfe6b5c506ca8 WatchSource:0}: Error finding container c70f7e79eae2f8f39fe16882075a3d50298d8791e764d7547b7cfe6b5c506ca8: Status 404 returned error can't find the container with id c70f7e79eae2f8f39fe16882075a3d50298d8791e764d7547b7cfe6b5c506ca8 Apr 22 18:14:44.539299 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.539266 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" event={"ID":"0811462c-f53b-4753-b679-edf6a901258b","Type":"ContainerStarted","Data":"66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979"} Apr 22 18:14:44.539431 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:44.539310 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" event={"ID":"0811462c-f53b-4753-b679-edf6a901258b","Type":"ContainerStarted","Data":"c70f7e79eae2f8f39fe16882075a3d50298d8791e764d7547b7cfe6b5c506ca8"} Apr 22 18:14:49.560402 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:49.560369 2568 generic.go:358] "Generic (PLEG): container finished" podID="0811462c-f53b-4753-b679-edf6a901258b" containerID="66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979" exitCode=0 Apr 22 18:14:49.560829 ip-10-0-142-118 kubenswrapper[2568]: I0422 
18:14:49.560417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" event={"ID":"0811462c-f53b-4753-b679-edf6a901258b","Type":"ContainerDied","Data":"66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979"} Apr 22 18:14:50.565723 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:50.565687 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" event={"ID":"0811462c-f53b-4753-b679-edf6a901258b","Type":"ContainerStarted","Data":"d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919"} Apr 22 18:14:50.590601 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:50.590540 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podStartSLOduration=7.5905257200000005 podStartE2EDuration="7.59052572s" podCreationTimestamp="2026-04-22 18:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:14:50.588600874 +0000 UTC m=+1318.897232362" watchObservedRunningTime="2026-04-22 18:14:50.59052572 +0000 UTC m=+1318.899157203" Apr 22 18:14:50.626179 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:50.626134 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 22 18:14:54.329543 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:54.329506 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:54.330045 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:54.329959 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:14:54.331075 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:14:54.331046 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:15:00.635451 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:00.635410 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:15:00.643769 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:00.643722 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:15:04.330127 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:04.330070 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:15:09.185585 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:09.185540 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr"] Apr 22 18:15:09.186074 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:09.185859 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" podUID="c20bef54-af40-4a86-99b1-e786c054d103" 
containerName="main" containerID="cri-o://de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59" gracePeriod=30 Apr 22 18:15:14.329241 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:14.329132 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:15:24.329642 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:24.329579 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:15:26.840524 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.840490 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp"] Apr 22 18:15:26.845449 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.845419 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:26.848417 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.848395 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-nnh5c\"" Apr 22 18:15:26.848691 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.848506 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 18:15:26.849479 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.849453 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t"] Apr 22 18:15:26.853128 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.853108 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:26.857140 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.857115 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp"] Apr 22 18:15:26.863705 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.863679 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t"] Apr 22 18:15:26.928895 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.928864 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e210af59-0cf6-4a50-be65-c0731c2634a6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:26.928895 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.928899 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:26.929110 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.928918 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:26.929110 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.928981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:26.929110 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.929034 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l246\" (UniqueName: \"kubernetes.io/projected/e210af59-0cf6-4a50-be65-c0731c2634a6-kube-api-access-7l246\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: 
\"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:26.929110 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:26.929097 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.029904 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.029863 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.030081 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.029919 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7l246\" (UniqueName: \"kubernetes.io/projected/e210af59-0cf6-4a50-be65-c0731c2634a6-kube-api-access-7l246\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.030081 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.029983 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.030081 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030029 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc735b23-ec98-49aa-abfe-d611164abed5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.030081 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030060 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.030299 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.030299 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030182 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: 
\"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.030299 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030242 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhsk6\" (UniqueName: \"kubernetes.io/projected/dc735b23-ec98-49aa-abfe-d611164abed5-kube-api-access-fhsk6\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.030299 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030275 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.030497 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030317 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e210af59-0cf6-4a50-be65-c0731c2634a6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.030497 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030343 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.030497 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030363 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.030497 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.030497 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.030774 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.030692 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.032516 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.032493 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.032816 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.032796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e210af59-0cf6-4a50-be65-c0731c2634a6-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.041486 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.041462 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l246\" (UniqueName: \"kubernetes.io/projected/e210af59-0cf6-4a50-be65-c0731c2634a6-kube-api-access-7l246\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.130865 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.130782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc735b23-ec98-49aa-abfe-d611164abed5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 
18:15:27.130865 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.130822 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.131053 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.130874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.131053 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.130900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.131053 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.130931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhsk6\" (UniqueName: \"kubernetes.io/projected/dc735b23-ec98-49aa-abfe-d611164abed5-kube-api-access-fhsk6\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.131053 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:15:27.130957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.131357 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.131328 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.131476 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.131369 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.131476 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.131426 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.133210 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.133189 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.133440 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.133423 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc735b23-ec98-49aa-abfe-d611164abed5-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.141012 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.140981 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhsk6\" (UniqueName: \"kubernetes.io/projected/dc735b23-ec98-49aa-abfe-d611164abed5-kube-api-access-fhsk6\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.157878 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.157853 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:27.167311 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.167285 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:27.307067 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.307034 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t"] Apr 22 18:15:27.310244 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:15:27.310211 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc735b23_ec98_49aa_abfe_d611164abed5.slice/crio-af425b6c6e2f73a14766a61fb7d8f8c08f4c81900c60770514402d0a752b62f9 WatchSource:0}: Error finding container af425b6c6e2f73a14766a61fb7d8f8c08f4c81900c60770514402d0a752b62f9: Status 404 returned error can't find the container with id af425b6c6e2f73a14766a61fb7d8f8c08f4c81900c60770514402d0a752b62f9 Apr 22 18:15:27.335797 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.335768 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp"] Apr 22 18:15:27.345117 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:15:27.345083 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode210af59_0cf6_4a50_be65_c0731c2634a6.slice/crio-0a8d37fc2ea49fee53776785ff25fbf9fa38ac2e4120c291e194271ce7b08ff6 WatchSource:0}: Error finding container 0a8d37fc2ea49fee53776785ff25fbf9fa38ac2e4120c291e194271ce7b08ff6: Status 404 returned error can't find the container with id 0a8d37fc2ea49fee53776785ff25fbf9fa38ac2e4120c291e194271ce7b08ff6 Apr 22 18:15:27.699594 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.699553 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" 
event={"ID":"e210af59-0cf6-4a50-be65-c0731c2634a6","Type":"ContainerStarted","Data":"0a8d37fc2ea49fee53776785ff25fbf9fa38ac2e4120c291e194271ce7b08ff6"} Apr 22 18:15:27.701499 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.701466 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" event={"ID":"dc735b23-ec98-49aa-abfe-d611164abed5","Type":"ContainerStarted","Data":"a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59"} Apr 22 18:15:27.701499 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:27.701506 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" event={"ID":"dc735b23-ec98-49aa-abfe-d611164abed5","Type":"ContainerStarted","Data":"af425b6c6e2f73a14766a61fb7d8f8c08f4c81900c60770514402d0a752b62f9"} Apr 22 18:15:28.708574 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:28.708534 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" event={"ID":"e210af59-0cf6-4a50-be65-c0731c2634a6","Type":"ContainerStarted","Data":"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db"} Apr 22 18:15:28.709027 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:28.708825 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:29.715174 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:29.715126 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" event={"ID":"e210af59-0cf6-4a50-be65-c0731c2634a6","Type":"ContainerStarted","Data":"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350"} Apr 22 18:15:32.728161 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:32.728121 2568 
generic.go:358] "Generic (PLEG): container finished" podID="dc735b23-ec98-49aa-abfe-d611164abed5" containerID="a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59" exitCode=0 Apr 22 18:15:32.728697 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:32.728180 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" event={"ID":"dc735b23-ec98-49aa-abfe-d611164abed5","Type":"ContainerDied","Data":"a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59"} Apr 22 18:15:33.733517 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:33.733479 2568 generic.go:358] "Generic (PLEG): container finished" podID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerID="60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350" exitCode=0 Apr 22 18:15:33.733960 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:33.733547 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" event={"ID":"e210af59-0cf6-4a50-be65-c0731c2634a6","Type":"ContainerDied","Data":"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350"} Apr 22 18:15:33.735680 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:33.735645 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" event={"ID":"dc735b23-ec98-49aa-abfe-d611164abed5","Type":"ContainerStarted","Data":"c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269"} Apr 22 18:15:33.772061 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:33.772005 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podStartSLOduration=7.771986009 podStartE2EDuration="7.771986009s" podCreationTimestamp="2026-04-22 18:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:15:33.771329693 +0000 UTC m=+1362.079961180" watchObservedRunningTime="2026-04-22 18:15:33.771986009 +0000 UTC m=+1362.080617497" Apr 22 18:15:34.329859 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:34.329806 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:15:34.742441 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:34.742395 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" event={"ID":"e210af59-0cf6-4a50-be65-c0731c2634a6","Type":"ContainerStarted","Data":"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603"} Apr 22 18:15:34.767448 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:34.767381 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podStartSLOduration=7.861229763 podStartE2EDuration="8.767362371s" podCreationTimestamp="2026-04-22 18:15:26 +0000 UTC" firstStartedPulling="2026-04-22 18:15:27.357415814 +0000 UTC m=+1355.666047296" lastFinishedPulling="2026-04-22 18:15:28.263548425 +0000 UTC m=+1356.572179904" observedRunningTime="2026-04-22 18:15:34.76381673 +0000 UTC m=+1363.072448216" watchObservedRunningTime="2026-04-22 18:15:34.767362371 +0000 UTC m=+1363.075993858" Apr 22 18:15:37.158346 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:37.158293 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:37.158866 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:15:37.158372 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:37.160001 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:37.159963 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 22 18:15:37.168296 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:37.168270 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:37.168382 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:37.168315 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:15:37.169536 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:37.169504 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 22 18:15:39.510589 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.510552 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-f99c8d868-gtdgr_c20bef54-af40-4a86-99b1-e786c054d103/main/0.log" Apr 22 18:15:39.511022 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.510994 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:15:39.644121 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.644075 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-dshm\") pod \"c20bef54-af40-4a86-99b1-e786c054d103\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " Apr 22 18:15:39.644121 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.644129 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-home\") pod \"c20bef54-af40-4a86-99b1-e786c054d103\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " Apr 22 18:15:39.644367 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.644176 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-model-cache\") pod \"c20bef54-af40-4a86-99b1-e786c054d103\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " Apr 22 18:15:39.644367 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.644214 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-kserve-provision-location\") pod \"c20bef54-af40-4a86-99b1-e786c054d103\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " Apr 22 18:15:39.644367 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.644244 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4gcq\" (UniqueName: \"kubernetes.io/projected/c20bef54-af40-4a86-99b1-e786c054d103-kube-api-access-z4gcq\") pod \"c20bef54-af40-4a86-99b1-e786c054d103\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " Apr 22 18:15:39.644367 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:15:39.644326 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c20bef54-af40-4a86-99b1-e786c054d103-tls-certs\") pod \"c20bef54-af40-4a86-99b1-e786c054d103\" (UID: \"c20bef54-af40-4a86-99b1-e786c054d103\") " Apr 22 18:15:39.644578 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.644479 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-model-cache" (OuterVolumeSpecName: "model-cache") pod "c20bef54-af40-4a86-99b1-e786c054d103" (UID: "c20bef54-af40-4a86-99b1-e786c054d103"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:15:39.644654 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.644633 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:15:39.645277 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.645249 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-home" (OuterVolumeSpecName: "home") pod "c20bef54-af40-4a86-99b1-e786c054d103" (UID: "c20bef54-af40-4a86-99b1-e786c054d103"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:15:39.647051 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.647018 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c20bef54-af40-4a86-99b1-e786c054d103-kube-api-access-z4gcq" (OuterVolumeSpecName: "kube-api-access-z4gcq") pod "c20bef54-af40-4a86-99b1-e786c054d103" (UID: "c20bef54-af40-4a86-99b1-e786c054d103"). InnerVolumeSpecName "kube-api-access-z4gcq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:15:39.647563 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.647540 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c20bef54-af40-4a86-99b1-e786c054d103-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c20bef54-af40-4a86-99b1-e786c054d103" (UID: "c20bef54-af40-4a86-99b1-e786c054d103"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:15:39.647660 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.647558 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-dshm" (OuterVolumeSpecName: "dshm") pod "c20bef54-af40-4a86-99b1-e786c054d103" (UID: "c20bef54-af40-4a86-99b1-e786c054d103"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:15:39.716922 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.716866 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c20bef54-af40-4a86-99b1-e786c054d103" (UID: "c20bef54-af40-4a86-99b1-e786c054d103"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:15:39.745464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.745373 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:15:39.745464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.745415 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4gcq\" (UniqueName: \"kubernetes.io/projected/c20bef54-af40-4a86-99b1-e786c054d103-kube-api-access-z4gcq\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:15:39.745464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.745433 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c20bef54-af40-4a86-99b1-e786c054d103-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:15:39.745464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.745446 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:15:39.745464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.745459 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c20bef54-af40-4a86-99b1-e786c054d103-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:15:39.764244 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.764209 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-f99c8d868-gtdgr_c20bef54-af40-4a86-99b1-e786c054d103/main/0.log" Apr 22 18:15:39.764632 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.764601 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="c20bef54-af40-4a86-99b1-e786c054d103" containerID="de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59" exitCode=137 Apr 22 18:15:39.764771 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.764694 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" event={"ID":"c20bef54-af40-4a86-99b1-e786c054d103","Type":"ContainerDied","Data":"de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59"} Apr 22 18:15:39.764771 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.764751 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" Apr 22 18:15:39.764771 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.764760 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr" event={"ID":"c20bef54-af40-4a86-99b1-e786c054d103","Type":"ContainerDied","Data":"2dcdf2765147d39fa399796ccab6f73bde6d960169136839676a612d7891431a"} Apr 22 18:15:39.764941 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.764782 2568 scope.go:117] "RemoveContainer" containerID="de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59" Apr 22 18:15:39.790302 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.790272 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr"] Apr 22 18:15:39.795851 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.795290 2568 scope.go:117] "RemoveContainer" containerID="41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e" Apr 22 18:15:39.798070 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.798042 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-f99c8d868-gtdgr"] Apr 22 18:15:39.895068 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.895040 2568 scope.go:117] 
"RemoveContainer" containerID="de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59" Apr 22 18:15:39.895468 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:15:39.895440 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59\": container with ID starting with de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59 not found: ID does not exist" containerID="de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59" Apr 22 18:15:39.895537 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.895484 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59"} err="failed to get container status \"de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59\": rpc error: code = NotFound desc = could not find container \"de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59\": container with ID starting with de601c5c6f5bbf5811b26183afb11b457095a774d1dc9d27cf7fc176b9c41c59 not found: ID does not exist" Apr 22 18:15:39.895537 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.895513 2568 scope.go:117] "RemoveContainer" containerID="41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e" Apr 22 18:15:39.895958 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:15:39.895936 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e\": container with ID starting with 41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e not found: ID does not exist" containerID="41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e" Apr 22 18:15:39.896042 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:39.895967 2568 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e"} err="failed to get container status \"41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e\": rpc error: code = NotFound desc = could not find container \"41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e\": container with ID starting with 41fa1becdf95e0e3b2f4f6acbf903ca7600d112df0384c1c171632d28fff436e not found: ID does not exist" Apr 22 18:15:40.214798 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:40.214759 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c20bef54-af40-4a86-99b1-e786c054d103" path="/var/lib/kubelet/pods/c20bef54-af40-4a86-99b1-e786c054d103/volumes" Apr 22 18:15:44.330107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:44.330060 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:15:47.159368 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:47.159322 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 22 18:15:47.167900 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:47.167866 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: 
connect: connection refused" Apr 22 18:15:47.772609 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:47.772579 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:15:54.330304 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:54.330247 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:15:57.158892 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:57.158820 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 22 18:15:57.168598 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:15:57.168552 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 22 18:16:04.329587 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:04.329532 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:16:07.158925 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:16:07.158871 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 22 18:16:07.168139 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:07.168096 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 22 18:16:14.329592 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:14.329535 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused" Apr 22 18:16:17.158771 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:17.158699 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 22 18:16:17.168681 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:17.168645 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 22 18:16:24.339826 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:24.339791 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:16:24.347479 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:24.347451 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" Apr 22 18:16:27.159075 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:27.159026 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 22 18:16:27.168392 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:27.168361 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 22 18:16:35.014456 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:35.014424 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k"] Apr 22 18:16:35.014896 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:35.014701 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" 
containerID="cri-o://d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919" gracePeriod=30 Apr 22 18:16:37.158598 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:37.158531 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused" Apr 22 18:16:37.167897 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:37.167856 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused" Apr 22 18:16:45.361039 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.360942 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:16:45.361387 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.361372 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" Apr 22 18:16:45.361428 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.361390 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main" Apr 22 18:16:45.361428 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.361407 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="storage-initializer" Apr 22 18:16:45.361428 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.361415 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="storage-initializer"
Apr 22 18:16:45.361591 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.361510 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c20bef54-af40-4a86-99b1-e786c054d103" containerName="main"
Apr 22 18:16:45.369386 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.369362 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.380015 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.379989 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-v2cmz\""
Apr 22 18:16:45.380983 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.380967 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 22 18:16:45.423397 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.423366 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 18:16:45.451102 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.451075 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05613f1e-73cb-4902-bc64-d66a6062bf0d-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.451247 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.451114 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.451247 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.451132 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.451361 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.451276 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.451361 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.451325 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.451453 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.451364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k59mn\" (UniqueName: \"kubernetes.io/projected/05613f1e-73cb-4902-bc64-d66a6062bf0d-kube-api-access-k59mn\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.552827 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.552788 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.552827 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.552832 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k59mn\" (UniqueName: \"kubernetes.io/projected/05613f1e-73cb-4902-bc64-d66a6062bf0d-kube-api-access-k59mn\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.553082 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.552889 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05613f1e-73cb-4902-bc64-d66a6062bf0d-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.553082 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.552939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.553082 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.552962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.553245 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.553125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.553245 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.553169 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.553351 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.553314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.553438 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.553420 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.555274 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.555246 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.555410 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.555397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05613f1e-73cb-4902-bc64-d66a6062bf0d-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.596279 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.596254 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k59mn\" (UniqueName: \"kubernetes.io/projected/05613f1e-73cb-4902-bc64-d66a6062bf0d-kube-api-access-k59mn\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.679525 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.679429 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:45.926216 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:45.926189 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 22 18:16:45.928952 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:16:45.928925 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05613f1e_73cb_4902_bc64_d66a6062bf0d.slice/crio-ab91ac4d9cf746660daafedc38c5801e0e875012b6fe90d2e2b3735639615672 WatchSource:0}: Error finding container ab91ac4d9cf746660daafedc38c5801e0e875012b6fe90d2e2b3735639615672: Status 404 returned error can't find the container with id ab91ac4d9cf746660daafedc38c5801e0e875012b6fe90d2e2b3735639615672
Apr 22 18:16:46.020815 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:46.020775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05613f1e-73cb-4902-bc64-d66a6062bf0d","Type":"ContainerStarted","Data":"9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008"}
Apr 22 18:16:46.020952 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:46.020822 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05613f1e-73cb-4902-bc64-d66a6062bf0d","Type":"ContainerStarted","Data":"ab91ac4d9cf746660daafedc38c5801e0e875012b6fe90d2e2b3735639615672"}
Apr 22 18:16:47.158792 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:47.158743 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 22 18:16:47.168310 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:47.168258 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 22 18:16:51.039323 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:51.039286 2568 generic.go:358] "Generic (PLEG): container finished" podID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerID="9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008" exitCode=0
Apr 22 18:16:51.039682 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:51.039346 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05613f1e-73cb-4902-bc64-d66a6062bf0d","Type":"ContainerDied","Data":"9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008"}
Apr 22 18:16:52.044917 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:52.044881 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05613f1e-73cb-4902-bc64-d66a6062bf0d","Type":"ContainerStarted","Data":"5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727"}
Apr 22 18:16:52.086574 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:52.086495 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.086474222 podStartE2EDuration="7.086474222s" podCreationTimestamp="2026-04-22 18:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:16:52.086295908 +0000 UTC m=+1440.394927396" watchObservedRunningTime="2026-04-22 18:16:52.086474222 +0000 UTC m=+1440.395105709"
Apr 22 18:16:55.679746 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:55.679693 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:16:55.681146 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:55.681113 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 22 18:16:57.159325 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:57.159272 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 22 18:16:57.168137 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:16:57.168096 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 22 18:17:05.474504 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.474473 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k_0811462c-f53b-4753-b679-edf6a901258b/main/0.log"
Apr 22 18:17:05.474961 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.474943 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k"
Apr 22 18:17:05.543320 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.543290 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh5w2\" (UniqueName: \"kubernetes.io/projected/0811462c-f53b-4753-b679-edf6a901258b-kube-api-access-xh5w2\") pod \"0811462c-f53b-4753-b679-edf6a901258b\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") "
Apr 22 18:17:05.543482 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.543373 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0811462c-f53b-4753-b679-edf6a901258b-tls-certs\") pod \"0811462c-f53b-4753-b679-edf6a901258b\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") "
Apr 22 18:17:05.543482 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.543399 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-model-cache\") pod \"0811462c-f53b-4753-b679-edf6a901258b\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") "
Apr 22 18:17:05.543482 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.543428 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-home\") pod \"0811462c-f53b-4753-b679-edf6a901258b\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") "
Apr 22 18:17:05.543482 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.543461 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-dshm\") pod \"0811462c-f53b-4753-b679-edf6a901258b\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") "
Apr 22 18:17:05.543714 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.543503 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-kserve-provision-location\") pod \"0811462c-f53b-4753-b679-edf6a901258b\" (UID: \"0811462c-f53b-4753-b679-edf6a901258b\") "
Apr 22 18:17:05.543802 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.543710 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-model-cache" (OuterVolumeSpecName: "model-cache") pod "0811462c-f53b-4753-b679-edf6a901258b" (UID: "0811462c-f53b-4753-b679-edf6a901258b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:17:05.543903 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.543865 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-home" (OuterVolumeSpecName: "home") pod "0811462c-f53b-4753-b679-edf6a901258b" (UID: "0811462c-f53b-4753-b679-edf6a901258b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:17:05.545776 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.545687 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-dshm" (OuterVolumeSpecName: "dshm") pod "0811462c-f53b-4753-b679-edf6a901258b" (UID: "0811462c-f53b-4753-b679-edf6a901258b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:17:05.545776 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.545712 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0811462c-f53b-4753-b679-edf6a901258b-kube-api-access-xh5w2" (OuterVolumeSpecName: "kube-api-access-xh5w2") pod "0811462c-f53b-4753-b679-edf6a901258b" (UID: "0811462c-f53b-4753-b679-edf6a901258b"). InnerVolumeSpecName "kube-api-access-xh5w2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:17:05.545991 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.545967 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0811462c-f53b-4753-b679-edf6a901258b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0811462c-f53b-4753-b679-edf6a901258b" (UID: "0811462c-f53b-4753-b679-edf6a901258b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:17:05.640383 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.640339 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0811462c-f53b-4753-b679-edf6a901258b" (UID: "0811462c-f53b-4753-b679-edf6a901258b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:17:05.644959 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.644929 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:17:05.645090 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.644965 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xh5w2\" (UniqueName: \"kubernetes.io/projected/0811462c-f53b-4753-b679-edf6a901258b-kube-api-access-xh5w2\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:17:05.645090 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.644981 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0811462c-f53b-4753-b679-edf6a901258b-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:17:05.645090 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.644995 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:17:05.645090 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.645009 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:17:05.645090 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.645021 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0811462c-f53b-4753-b679-edf6a901258b-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:17:05.680066 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:05.680031 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 22 18:17:06.099552 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.099519 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k_0811462c-f53b-4753-b679-edf6a901258b/main/0.log"
Apr 22 18:17:06.099929 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.099904 2568 generic.go:358] "Generic (PLEG): container finished" podID="0811462c-f53b-4753-b679-edf6a901258b" containerID="d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919" exitCode=137
Apr 22 18:17:06.100040 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.099952 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" event={"ID":"0811462c-f53b-4753-b679-edf6a901258b","Type":"ContainerDied","Data":"d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919"}
Apr 22 18:17:06.100040 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.099992 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k" event={"ID":"0811462c-f53b-4753-b679-edf6a901258b","Type":"ContainerDied","Data":"c70f7e79eae2f8f39fe16882075a3d50298d8791e764d7547b7cfe6b5c506ca8"}
Apr 22 18:17:06.100040 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.100001 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k"
Apr 22 18:17:06.100040 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.100014 2568 scope.go:117] "RemoveContainer" containerID="d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919"
Apr 22 18:17:06.122593 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.122565 2568 scope.go:117] "RemoveContainer" containerID="66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979"
Apr 22 18:17:06.127482 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.127455 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k"]
Apr 22 18:17:06.133408 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.133378 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7864657558qjg4k"]
Apr 22 18:17:06.134582 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.134561 2568 scope.go:117] "RemoveContainer" containerID="d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919"
Apr 22 18:17:06.134913 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:17:06.134891 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919\": container with ID starting with d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919 not found: ID does not exist" containerID="d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919"
Apr 22 18:17:06.135019 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.134925 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919"} err="failed to get container status \"d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919\": rpc error: code = NotFound desc = could not find container \"d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919\": container with ID starting with d89510fb6beaa3631200be0163f83b31b3d45448c95de672609b77451b475919 not found: ID does not exist"
Apr 22 18:17:06.135019 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.134949 2568 scope.go:117] "RemoveContainer" containerID="66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979"
Apr 22 18:17:06.135249 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:17:06.135230 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979\": container with ID starting with 66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979 not found: ID does not exist" containerID="66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979"
Apr 22 18:17:06.135297 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.135257 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979"} err="failed to get container status \"66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979\": rpc error: code = NotFound desc = could not find container \"66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979\": container with ID starting with 66ca30b2f7eb9c9f811bc38b19ac3cb64e880cb93a962b96c6ab9e2ae065c979 not found: ID does not exist"
Apr 22 18:17:06.212766 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:06.212710 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0811462c-f53b-4753-b679-edf6a901258b" path="/var/lib/kubelet/pods/0811462c-f53b-4753-b679-edf6a901258b/volumes"
Apr 22 18:17:07.158651 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:07.158600 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 22 18:17:07.167902 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:07.167857 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 22 18:17:15.680552 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:15.680509 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 22 18:17:15.681047 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:15.680767 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 22 18:17:17.159131 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:17.159080 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 22 18:17:17.167813 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:17.167780 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 22 18:17:25.680686 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:25.680630 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 22 18:17:27.158676 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:27.158628 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 22 18:17:27.168134 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:27.168091 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 22 18:17:35.680302 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:35.680248 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 22 18:17:37.158569 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:37.158505 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 22 18:17:37.168218 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:37.168178 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 22 18:17:45.680090 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:45.680025 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 22 18:17:47.159157 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:47.159105 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8001/health\": dial tcp 10.132.0.36:8001: connect: connection refused"
Apr 22 18:17:47.168138 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:47.168099 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8000/health\": dial tcp 10.132.0.37:8000: connect: connection refused"
Apr 22 18:17:52.213049 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:52.213015 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log"
Apr 22 18:17:52.215062 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:52.215037 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log"
Apr 22 18:17:55.679897 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:55.679852 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 22 18:17:57.167880 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:57.167833 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp"
Apr 22 18:17:57.177844 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:57.177814 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t"
Apr 22 18:17:57.180559 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:57.180538 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp"
Apr 22 18:17:57.185632 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:17:57.185609 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t"
Apr 22 18:18:05.680800 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:05.680750 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 22 18:18:14.964147 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:14.964030 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp"]
Apr 22 18:18:14.968207 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:14.966702 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" containerID="cri-o://1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603" gracePeriod=30
Apr 22 18:18:14.969548 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:14.969518 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t"]
Apr 22 18:18:14.971115 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:14.971060 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" containerID="cri-o://c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269" gracePeriod=30
Apr 22 18:18:15.680444 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:15.680398 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 22 18:18:25.680597 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:25.680557 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 22 18:18:35.680122 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:35.680080 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 22 18:18:44.966880 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:44.966802 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="llm-d-routing-sidecar" containerID="cri-o://4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db" gracePeriod=2 Apr 22 18:18:45.322114 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.322089 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp_e210af59-0cf6-4a50-be65-c0731c2634a6/main/0.log" Apr 22 18:18:45.322772 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.322752 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:18:45.325342 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.325326 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:18:45.429772 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.429722 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e210af59-0cf6-4a50-be65-c0731c2634a6-tls-certs\") pod \"e210af59-0cf6-4a50-be65-c0731c2634a6\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " Apr 22 18:18:45.429772 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.429775 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-model-cache\") pod \"e210af59-0cf6-4a50-be65-c0731c2634a6\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " Apr 22 18:18:45.430017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.429816 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-dshm\") pod \"dc735b23-ec98-49aa-abfe-d611164abed5\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " Apr 22 18:18:45.430017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.429852 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-kserve-provision-location\") pod \"e210af59-0cf6-4a50-be65-c0731c2634a6\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " Apr 22 18:18:45.430017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.429878 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-home\") pod \"e210af59-0cf6-4a50-be65-c0731c2634a6\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " Apr 22 18:18:45.430017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.429908 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-model-cache\") pod \"dc735b23-ec98-49aa-abfe-d611164abed5\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " Apr 22 18:18:45.430017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.429931 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-dshm\") pod \"e210af59-0cf6-4a50-be65-c0731c2634a6\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " Apr 22 18:18:45.430017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.429953 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l246\" (UniqueName: \"kubernetes.io/projected/e210af59-0cf6-4a50-be65-c0731c2634a6-kube-api-access-7l246\") pod \"e210af59-0cf6-4a50-be65-c0731c2634a6\" (UID: \"e210af59-0cf6-4a50-be65-c0731c2634a6\") " Apr 22 18:18:45.430017 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430014 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc735b23-ec98-49aa-abfe-d611164abed5-tls-certs\") pod \"dc735b23-ec98-49aa-abfe-d611164abed5\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " Apr 22 18:18:45.430463 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430051 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-kserve-provision-location\") pod 
\"dc735b23-ec98-49aa-abfe-d611164abed5\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " Apr 22 18:18:45.430463 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430053 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-model-cache" (OuterVolumeSpecName: "model-cache") pod "e210af59-0cf6-4a50-be65-c0731c2634a6" (UID: "e210af59-0cf6-4a50-be65-c0731c2634a6"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:45.430463 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430075 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhsk6\" (UniqueName: \"kubernetes.io/projected/dc735b23-ec98-49aa-abfe-d611164abed5-kube-api-access-fhsk6\") pod \"dc735b23-ec98-49aa-abfe-d611164abed5\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " Apr 22 18:18:45.430463 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430138 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-home\") pod \"dc735b23-ec98-49aa-abfe-d611164abed5\" (UID: \"dc735b23-ec98-49aa-abfe-d611164abed5\") " Apr 22 18:18:45.430463 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430153 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-model-cache" (OuterVolumeSpecName: "model-cache") pod "dc735b23-ec98-49aa-abfe-d611164abed5" (UID: "dc735b23-ec98-49aa-abfe-d611164abed5"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:45.430463 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430278 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-home" (OuterVolumeSpecName: "home") pod "e210af59-0cf6-4a50-be65-c0731c2634a6" (UID: "e210af59-0cf6-4a50-be65-c0731c2634a6"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:45.430818 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430505 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.430818 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430533 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.430818 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430561 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.430818 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.430645 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-home" (OuterVolumeSpecName: "home") pod "dc735b23-ec98-49aa-abfe-d611164abed5" (UID: "dc735b23-ec98-49aa-abfe-d611164abed5"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:45.432510 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.432443 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc735b23-ec98-49aa-abfe-d611164abed5-kube-api-access-fhsk6" (OuterVolumeSpecName: "kube-api-access-fhsk6") pod "dc735b23-ec98-49aa-abfe-d611164abed5" (UID: "dc735b23-ec98-49aa-abfe-d611164abed5"). InnerVolumeSpecName "kube-api-access-fhsk6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:18:45.432510 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.432453 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e210af59-0cf6-4a50-be65-c0731c2634a6-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e210af59-0cf6-4a50-be65-c0731c2634a6" (UID: "e210af59-0cf6-4a50-be65-c0731c2634a6"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:18:45.432702 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.432535 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e210af59-0cf6-4a50-be65-c0731c2634a6-kube-api-access-7l246" (OuterVolumeSpecName: "kube-api-access-7l246") pod "e210af59-0cf6-4a50-be65-c0731c2634a6" (UID: "e210af59-0cf6-4a50-be65-c0731c2634a6"). InnerVolumeSpecName "kube-api-access-7l246". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:18:45.432817 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.432785 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc735b23-ec98-49aa-abfe-d611164abed5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dc735b23-ec98-49aa-abfe-d611164abed5" (UID: "dc735b23-ec98-49aa-abfe-d611164abed5"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:18:45.433027 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.432834 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-dshm" (OuterVolumeSpecName: "dshm") pod "e210af59-0cf6-4a50-be65-c0731c2634a6" (UID: "e210af59-0cf6-4a50-be65-c0731c2634a6"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:45.433641 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.433618 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-dshm" (OuterVolumeSpecName: "dshm") pod "dc735b23-ec98-49aa-abfe-d611164abed5" (UID: "dc735b23-ec98-49aa-abfe-d611164abed5"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:45.447851 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.447827 2568 generic.go:358] "Generic (PLEG): container finished" podID="dc735b23-ec98-49aa-abfe-d611164abed5" containerID="c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269" exitCode=137 Apr 22 18:18:45.447978 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.447913 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" Apr 22 18:18:45.447978 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.447908 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" event={"ID":"dc735b23-ec98-49aa-abfe-d611164abed5","Type":"ContainerDied","Data":"c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269"} Apr 22 18:18:45.448095 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.447994 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t" event={"ID":"dc735b23-ec98-49aa-abfe-d611164abed5","Type":"ContainerDied","Data":"af425b6c6e2f73a14766a61fb7d8f8c08f4c81900c60770514402d0a752b62f9"} Apr 22 18:18:45.448095 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.448020 2568 scope.go:117] "RemoveContainer" containerID="c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269" Apr 22 18:18:45.449313 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.449298 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp_e210af59-0cf6-4a50-be65-c0731c2634a6/main/0.log" Apr 22 18:18:45.449952 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.449929 2568 generic.go:358] "Generic (PLEG): container finished" podID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerID="1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603" exitCode=137 Apr 22 18:18:45.449952 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.449950 2568 generic.go:358] "Generic (PLEG): container finished" podID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerID="4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db" exitCode=0 Apr 22 18:18:45.450107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.449973 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" event={"ID":"e210af59-0cf6-4a50-be65-c0731c2634a6","Type":"ContainerDied","Data":"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603"} Apr 22 18:18:45.450107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.449993 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" event={"ID":"e210af59-0cf6-4a50-be65-c0731c2634a6","Type":"ContainerDied","Data":"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db"} Apr 22 18:18:45.450107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.450003 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" event={"ID":"e210af59-0cf6-4a50-be65-c0731c2634a6","Type":"ContainerDied","Data":"0a8d37fc2ea49fee53776785ff25fbf9fa38ac2e4120c291e194271ce7b08ff6"} Apr 22 18:18:45.450107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.450030 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp" Apr 22 18:18:45.469221 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.469201 2568 scope.go:117] "RemoveContainer" containerID="a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59" Apr 22 18:18:45.492719 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.492665 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e210af59-0cf6-4a50-be65-c0731c2634a6" (UID: "e210af59-0cf6-4a50-be65-c0731c2634a6"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:45.494149 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.494124 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dc735b23-ec98-49aa-abfe-d611164abed5" (UID: "dc735b23-ec98-49aa-abfe-d611164abed5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:45.531817 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531785 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e210af59-0cf6-4a50-be65-c0731c2634a6-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.531817 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531810 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.532002 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531823 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.532002 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531838 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e210af59-0cf6-4a50-be65-c0731c2634a6-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.532002 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531852 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7l246\" (UniqueName: 
\"kubernetes.io/projected/e210af59-0cf6-4a50-be65-c0731c2634a6-kube-api-access-7l246\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.532002 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531863 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc735b23-ec98-49aa-abfe-d611164abed5-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.532002 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531878 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.532002 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531890 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhsk6\" (UniqueName: \"kubernetes.io/projected/dc735b23-ec98-49aa-abfe-d611164abed5-kube-api-access-fhsk6\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.532002 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.531903 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dc735b23-ec98-49aa-abfe-d611164abed5-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:45.533627 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.533610 2568 scope.go:117] "RemoveContainer" containerID="c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269" Apr 22 18:18:45.533969 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:18:45.533950 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269\": container with ID starting with c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269 not found: ID does not exist" 
containerID="c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269" Apr 22 18:18:45.534024 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.533980 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269"} err="failed to get container status \"c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269\": rpc error: code = NotFound desc = could not find container \"c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269\": container with ID starting with c6b374481004071f4f7840799692a1dd753795ce962f6a1c9aee5bce78881269 not found: ID does not exist" Apr 22 18:18:45.534024 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.533999 2568 scope.go:117] "RemoveContainer" containerID="a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59" Apr 22 18:18:45.534243 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:18:45.534225 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59\": container with ID starting with a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59 not found: ID does not exist" containerID="a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59" Apr 22 18:18:45.534304 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.534252 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59"} err="failed to get container status \"a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59\": rpc error: code = NotFound desc = could not find container \"a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59\": container with ID starting with a0321f6033f831e8b16e86500a5d59b27db4b61d51079ef00bb5738d2cddfc59 not found: ID does not exist" Apr 22 
18:18:45.534304 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.534276 2568 scope.go:117] "RemoveContainer" containerID="1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603" Apr 22 18:18:45.554852 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.554829 2568 scope.go:117] "RemoveContainer" containerID="60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350" Apr 22 18:18:45.614200 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.614177 2568 scope.go:117] "RemoveContainer" containerID="4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db" Apr 22 18:18:45.621695 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.621675 2568 scope.go:117] "RemoveContainer" containerID="1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603" Apr 22 18:18:45.621994 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:18:45.621973 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603\": container with ID starting with 1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603 not found: ID does not exist" containerID="1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603" Apr 22 18:18:45.622068 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.622008 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603"} err="failed to get container status \"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603\": rpc error: code = NotFound desc = could not find container \"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603\": container with ID starting with 1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603 not found: ID does not exist" Apr 22 18:18:45.622068 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.622037 2568 scope.go:117] "RemoveContainer" 
containerID="60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350" Apr 22 18:18:45.622286 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:18:45.622268 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350\": container with ID starting with 60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350 not found: ID does not exist" containerID="60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350" Apr 22 18:18:45.622326 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.622292 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350"} err="failed to get container status \"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350\": rpc error: code = NotFound desc = could not find container \"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350\": container with ID starting with 60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350 not found: ID does not exist" Apr 22 18:18:45.622326 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.622307 2568 scope.go:117] "RemoveContainer" containerID="4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db" Apr 22 18:18:45.622539 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:18:45.622516 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db\": container with ID starting with 4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db not found: ID does not exist" containerID="4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db" Apr 22 18:18:45.622592 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.622551 2568 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db"} err="failed to get container status \"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db\": rpc error: code = NotFound desc = could not find container \"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db\": container with ID starting with 4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db not found: ID does not exist" Apr 22 18:18:45.622592 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.622567 2568 scope.go:117] "RemoveContainer" containerID="1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603" Apr 22 18:18:45.622790 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.622769 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603"} err="failed to get container status \"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603\": rpc error: code = NotFound desc = could not find container \"1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603\": container with ID starting with 1a6c82578cdfe354eaf9fc622731222ba74b40bfdbf96e4e3e2599759d2e4603 not found: ID does not exist" Apr 22 18:18:45.622845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.622794 2568 scope.go:117] "RemoveContainer" containerID="60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350" Apr 22 18:18:45.623053 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.623035 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350"} err="failed to get container status \"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350\": rpc error: code = NotFound desc = could not find container \"60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350\": container with ID starting with 
60f3c716c667b68411415f9d603d829849a51196a2bedf35b39ff6dc0e54f350 not found: ID does not exist" Apr 22 18:18:45.623101 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.623054 2568 scope.go:117] "RemoveContainer" containerID="4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db" Apr 22 18:18:45.623274 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.623255 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db"} err="failed to get container status \"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db\": rpc error: code = NotFound desc = could not find container \"4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db\": container with ID starting with 4a5b1f0df7940d9a4db7eaf990648fb95386d5f7a9e34c846fb01c302f5fd5db not found: ID does not exist" Apr 22 18:18:45.689680 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.689654 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:18:45.697149 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.697127 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:18:45.777835 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.777805 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp"] Apr 22 18:18:45.782751 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.782697 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5dc98d6cd5fxgqp"] Apr 22 18:18:45.793614 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.793588 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t"] Apr 22 18:18:45.797836 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:45.797814 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6cb956t"] Apr 22 18:18:46.213769 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:46.213711 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" path="/var/lib/kubelet/pods/dc735b23-ec98-49aa-abfe-d611164abed5/volumes" Apr 22 18:18:46.214355 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:46.214333 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" path="/var/lib/kubelet/pods/e210af59-0cf6-4a50-be65-c0731c2634a6/volumes" Apr 22 18:18:53.109615 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:53.109579 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:18:53.110075 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:53.110046 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" containerID="cri-o://5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727" gracePeriod=30 Apr 22 18:18:53.942685 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:53.942655 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:18:54.003416 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003331 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-model-cache\") pod \"05613f1e-73cb-4902-bc64-d66a6062bf0d\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " Apr 22 18:18:54.003416 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003391 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-kserve-provision-location\") pod \"05613f1e-73cb-4902-bc64-d66a6062bf0d\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " Apr 22 18:18:54.003416 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003416 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k59mn\" (UniqueName: \"kubernetes.io/projected/05613f1e-73cb-4902-bc64-d66a6062bf0d-kube-api-access-k59mn\") pod \"05613f1e-73cb-4902-bc64-d66a6062bf0d\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " Apr 22 18:18:54.003659 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003468 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05613f1e-73cb-4902-bc64-d66a6062bf0d-tls-certs\") pod \"05613f1e-73cb-4902-bc64-d66a6062bf0d\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " Apr 22 18:18:54.003659 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003506 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-dshm\") pod \"05613f1e-73cb-4902-bc64-d66a6062bf0d\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " Apr 22 18:18:54.003659 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003541 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-home\") pod \"05613f1e-73cb-4902-bc64-d66a6062bf0d\" (UID: \"05613f1e-73cb-4902-bc64-d66a6062bf0d\") " Apr 22 18:18:54.003659 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003608 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-model-cache" (OuterVolumeSpecName: "model-cache") pod "05613f1e-73cb-4902-bc64-d66a6062bf0d" (UID: "05613f1e-73cb-4902-bc64-d66a6062bf0d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:54.003898 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003791 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.003971 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.003953 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-home" (OuterVolumeSpecName: "home") pod "05613f1e-73cb-4902-bc64-d66a6062bf0d" (UID: "05613f1e-73cb-4902-bc64-d66a6062bf0d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:54.005617 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.005586 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-dshm" (OuterVolumeSpecName: "dshm") pod "05613f1e-73cb-4902-bc64-d66a6062bf0d" (UID: "05613f1e-73cb-4902-bc64-d66a6062bf0d"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:54.006108 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.006078 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05613f1e-73cb-4902-bc64-d66a6062bf0d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "05613f1e-73cb-4902-bc64-d66a6062bf0d" (UID: "05613f1e-73cb-4902-bc64-d66a6062bf0d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:18:54.006213 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.006133 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05613f1e-73cb-4902-bc64-d66a6062bf0d-kube-api-access-k59mn" (OuterVolumeSpecName: "kube-api-access-k59mn") pod "05613f1e-73cb-4902-bc64-d66a6062bf0d" (UID: "05613f1e-73cb-4902-bc64-d66a6062bf0d"). InnerVolumeSpecName "kube-api-access-k59mn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:18:54.061438 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.061395 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "05613f1e-73cb-4902-bc64-d66a6062bf0d" (UID: "05613f1e-73cb-4902-bc64-d66a6062bf0d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:18:54.105108 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.105073 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.105108 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.105105 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.105262 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.105116 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05613f1e-73cb-4902-bc64-d66a6062bf0d-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.105262 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.105128 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k59mn\" (UniqueName: \"kubernetes.io/projected/05613f1e-73cb-4902-bc64-d66a6062bf0d-kube-api-access-k59mn\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.105262 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.105139 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/05613f1e-73cb-4902-bc64-d66a6062bf0d-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:18:54.484962 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.484933 2568 generic.go:358] "Generic (PLEG): container finished" podID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerID="5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727" exitCode=0 Apr 22 18:18:54.485332 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.484969 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05613f1e-73cb-4902-bc64-d66a6062bf0d","Type":"ContainerDied","Data":"5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727"} Apr 22 18:18:54.485332 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.484991 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"05613f1e-73cb-4902-bc64-d66a6062bf0d","Type":"ContainerDied","Data":"ab91ac4d9cf746660daafedc38c5801e0e875012b6fe90d2e2b3735639615672"} Apr 22 18:18:54.485332 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.485006 2568 scope.go:117] "RemoveContainer" containerID="5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727" Apr 22 18:18:54.485332 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.485021 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 22 18:18:54.501335 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.501307 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:18:54.502671 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.502649 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 22 18:18:54.504877 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.504859 2568 scope.go:117] "RemoveContainer" containerID="9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008" Apr 22 18:18:54.565375 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.565351 2568 scope.go:117] "RemoveContainer" containerID="5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727" Apr 22 18:18:54.565716 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:18:54.565694 2568 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727\": container with ID starting with 5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727 not found: ID does not exist" containerID="5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727" Apr 22 18:18:54.565779 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.565742 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727"} err="failed to get container status \"5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727\": rpc error: code = NotFound desc = could not find container \"5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727\": container with ID starting with 5c063c600836652eb573848bea58099d0c6a547bb93c4267b9639f54a4f9f727 not found: ID does not exist" Apr 22 18:18:54.565779 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.565768 2568 scope.go:117] "RemoveContainer" containerID="9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008" Apr 22 18:18:54.566076 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:18:54.566048 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008\": container with ID starting with 9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008 not found: ID does not exist" containerID="9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008" Apr 22 18:18:54.566123 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:54.566076 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008"} err="failed to get container status 
\"9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008\": rpc error: code = NotFound desc = could not find container \"9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008\": container with ID starting with 9759b5c74e34a51cc26262ccb0e0a28674f8d03bda8a95c6ef88bfc152939008 not found: ID does not exist" Apr 22 18:18:56.213552 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:18:56.213519 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" path="/var/lib/kubelet/pods/05613f1e-73cb-4902-bc64-d66a6062bf0d/volumes" Apr 22 18:19:00.711028 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.710994 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"] Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711297 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711307 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711316 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711323 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711338 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="storage-initializer" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711344 2568 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="storage-initializer" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711352 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711358 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711364 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="storage-initializer" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711369 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="storage-initializer" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711379 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="storage-initializer" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711384 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="storage-initializer" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711390 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="llm-d-routing-sidecar" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711396 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="llm-d-routing-sidecar" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 
18:19:00.711403 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="storage-initializer" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711409 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="storage-initializer" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711415 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" Apr 22 18:19:00.711420 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711420 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" Apr 22 18:19:00.711947 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711466 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc735b23-ec98-49aa-abfe-d611164abed5" containerName="main" Apr 22 18:19:00.711947 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711475 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="llm-d-routing-sidecar" Apr 22 18:19:00.711947 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711481 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="05613f1e-73cb-4902-bc64-d66a6062bf0d" containerName="main" Apr 22 18:19:00.711947 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711488 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0811462c-f53b-4753-b679-edf6a901258b" containerName="main" Apr 22 18:19:00.711947 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.711493 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e210af59-0cf6-4a50-be65-c0731c2634a6" containerName="main" Apr 22 18:19:00.714918 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.714898 2568 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.720112 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.720087 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\"" Apr 22 18:19:00.720112 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.720090 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:19:00.720357 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.720129 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 22 18:19:00.720357 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.720159 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:19:00.728136 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.728105 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"] Apr 22 18:19:00.758892 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.758855 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhjh7\" (UniqueName: \"kubernetes.io/projected/7737a699-94c6-46a3-8dda-c93896239f27-kube-api-access-xhjh7\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.759004 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.758915 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-model-cache\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.759004 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.758956 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-dshm\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.759004 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.758977 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-home\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.759106 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.759016 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7737a699-94c6-46a3-8dda-c93896239f27-tls-certs\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.759106 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.759049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-kserve-provision-location\") pod 
\"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860160 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhjh7\" (UniqueName: \"kubernetes.io/projected/7737a699-94c6-46a3-8dda-c93896239f27-kube-api-access-xhjh7\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860327 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860172 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-model-cache\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860327 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860203 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-dshm\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860327 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-home\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860327 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860261 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7737a699-94c6-46a3-8dda-c93896239f27-tls-certs\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860327 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860301 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860577 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-model-cache\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860690 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860669 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-home\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" Apr 22 18:19:00.860781 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.860680 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:00.862538 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.862505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-dshm\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:00.862717 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.862703 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7737a699-94c6-46a3-8dda-c93896239f27-tls-certs\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:00.869095 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:00.869076 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhjh7\" (UniqueName: \"kubernetes.io/projected/7737a699-94c6-46a3-8dda-c93896239f27-kube-api-access-xhjh7\") pod \"scheduler-inline-config-test-kserve-75fb8ff865-284gf\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:01.027231 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:01.027143 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:01.153266 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:01.153235 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"]
Apr 22 18:19:01.155143 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:19:01.155110 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7737a699_94c6_46a3_8dda_c93896239f27.slice/crio-19cd79c0a32758d6e6c03e20522f2b7cc17bd119e86f894d8fea9d71bf6706f3 WatchSource:0}: Error finding container 19cd79c0a32758d6e6c03e20522f2b7cc17bd119e86f894d8fea9d71bf6706f3: Status 404 returned error can't find the container with id 19cd79c0a32758d6e6c03e20522f2b7cc17bd119e86f894d8fea9d71bf6706f3
Apr 22 18:19:01.156997 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:01.156982 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:19:01.508703 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:01.508668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" event={"ID":"7737a699-94c6-46a3-8dda-c93896239f27","Type":"ContainerStarted","Data":"ec55861018e093ca5115343cf43b902f47e036111b5763d93c275ac8df4bd5c8"}
Apr 22 18:19:01.508703 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:01.508705 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" event={"ID":"7737a699-94c6-46a3-8dda-c93896239f27","Type":"ContainerStarted","Data":"19cd79c0a32758d6e6c03e20522f2b7cc17bd119e86f894d8fea9d71bf6706f3"}
Apr 22 18:19:06.526396 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:06.526362 2568 generic.go:358] "Generic (PLEG): container finished" podID="7737a699-94c6-46a3-8dda-c93896239f27" containerID="ec55861018e093ca5115343cf43b902f47e036111b5763d93c275ac8df4bd5c8" exitCode=0
Apr 22 18:19:06.526914 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:06.526438 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" event={"ID":"7737a699-94c6-46a3-8dda-c93896239f27","Type":"ContainerDied","Data":"ec55861018e093ca5115343cf43b902f47e036111b5763d93c275ac8df4bd5c8"}
Apr 22 18:19:07.530669 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:07.530635 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" event={"ID":"7737a699-94c6-46a3-8dda-c93896239f27","Type":"ContainerStarted","Data":"2c0ac32c1f69dd02e2ecb137646c9843c3ee06d4dfe66ef3f4c3f72b6cc63c6c"}
Apr 22 18:19:07.550714 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:07.550661 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" podStartSLOduration=7.5506474709999996 podStartE2EDuration="7.550647471s" podCreationTimestamp="2026-04-22 18:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:19:07.549988445 +0000 UTC m=+1575.858619932" watchObservedRunningTime="2026-04-22 18:19:07.550647471 +0000 UTC m=+1575.859278959"
Apr 22 18:19:11.027314 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:11.027273 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:11.027314 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:11.027313 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:11.039883 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:11.039858 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:11.555457 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:11.555426 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:33.431663 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.431632 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"]
Apr 22 18:19:33.432119 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.431935 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" podUID="7737a699-94c6-46a3-8dda-c93896239f27" containerName="main" containerID="cri-o://2c0ac32c1f69dd02e2ecb137646c9843c3ee06d4dfe66ef3f4c3f72b6cc63c6c" gracePeriod=30
Apr 22 18:19:33.614484 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.614436 2568 generic.go:358] "Generic (PLEG): container finished" podID="7737a699-94c6-46a3-8dda-c93896239f27" containerID="2c0ac32c1f69dd02e2ecb137646c9843c3ee06d4dfe66ef3f4c3f72b6cc63c6c" exitCode=0
Apr 22 18:19:33.614717 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.614557 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" event={"ID":"7737a699-94c6-46a3-8dda-c93896239f27","Type":"ContainerDied","Data":"2c0ac32c1f69dd02e2ecb137646c9843c3ee06d4dfe66ef3f4c3f72b6cc63c6c"}
Apr 22 18:19:33.679006 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.678983 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:33.735127 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.735097 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7737a699-94c6-46a3-8dda-c93896239f27-tls-certs\") pod \"7737a699-94c6-46a3-8dda-c93896239f27\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") "
Apr 22 18:19:33.735295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.735146 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-model-cache\") pod \"7737a699-94c6-46a3-8dda-c93896239f27\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") "
Apr 22 18:19:33.735295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.735190 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhjh7\" (UniqueName: \"kubernetes.io/projected/7737a699-94c6-46a3-8dda-c93896239f27-kube-api-access-xhjh7\") pod \"7737a699-94c6-46a3-8dda-c93896239f27\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") "
Apr 22 18:19:33.735295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.735205 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-kserve-provision-location\") pod \"7737a699-94c6-46a3-8dda-c93896239f27\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") "
Apr 22 18:19:33.735295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.735226 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-home\") pod \"7737a699-94c6-46a3-8dda-c93896239f27\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") "
Apr 22 18:19:33.735295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.735249 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-dshm\") pod \"7737a699-94c6-46a3-8dda-c93896239f27\" (UID: \"7737a699-94c6-46a3-8dda-c93896239f27\") "
Apr 22 18:19:33.735565 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.735476 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-model-cache" (OuterVolumeSpecName: "model-cache") pod "7737a699-94c6-46a3-8dda-c93896239f27" (UID: "7737a699-94c6-46a3-8dda-c93896239f27"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:19:33.735565 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.735517 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-home" (OuterVolumeSpecName: "home") pod "7737a699-94c6-46a3-8dda-c93896239f27" (UID: "7737a699-94c6-46a3-8dda-c93896239f27"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:19:33.737291 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.737257 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-dshm" (OuterVolumeSpecName: "dshm") pod "7737a699-94c6-46a3-8dda-c93896239f27" (UID: "7737a699-94c6-46a3-8dda-c93896239f27"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:19:33.737291 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.737270 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7737a699-94c6-46a3-8dda-c93896239f27-kube-api-access-xhjh7" (OuterVolumeSpecName: "kube-api-access-xhjh7") pod "7737a699-94c6-46a3-8dda-c93896239f27" (UID: "7737a699-94c6-46a3-8dda-c93896239f27"). InnerVolumeSpecName "kube-api-access-xhjh7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:19:33.737448 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.737330 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7737a699-94c6-46a3-8dda-c93896239f27-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7737a699-94c6-46a3-8dda-c93896239f27" (UID: "7737a699-94c6-46a3-8dda-c93896239f27"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:19:33.800879 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.800845 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7737a699-94c6-46a3-8dda-c93896239f27" (UID: "7737a699-94c6-46a3-8dda-c93896239f27"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:19:33.836500 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.836477 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xhjh7\" (UniqueName: \"kubernetes.io/projected/7737a699-94c6-46a3-8dda-c93896239f27-kube-api-access-xhjh7\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:19:33.836500 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.836501 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:19:33.836613 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.836511 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:19:33.836613 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.836520 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:19:33.836613 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.836527 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7737a699-94c6-46a3-8dda-c93896239f27-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:19:33.836613 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:33.836538 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7737a699-94c6-46a3-8dda-c93896239f27-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\""
Apr 22 18:19:34.619139 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:34.619056 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"
Apr 22 18:19:34.619139 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:34.619071 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf" event={"ID":"7737a699-94c6-46a3-8dda-c93896239f27","Type":"ContainerDied","Data":"19cd79c0a32758d6e6c03e20522f2b7cc17bd119e86f894d8fea9d71bf6706f3"}
Apr 22 18:19:34.619139 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:34.619121 2568 scope.go:117] "RemoveContainer" containerID="2c0ac32c1f69dd02e2ecb137646c9843c3ee06d4dfe66ef3f4c3f72b6cc63c6c"
Apr 22 18:19:34.627278 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:34.627261 2568 scope.go:117] "RemoveContainer" containerID="ec55861018e093ca5115343cf43b902f47e036111b5763d93c275ac8df4bd5c8"
Apr 22 18:19:34.637161 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:34.637141 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"]
Apr 22 18:19:34.640765 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:34.640745 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-75fb8ff865-284gf"]
Apr 22 18:19:36.212580 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:19:36.212546 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7737a699-94c6-46a3-8dda-c93896239f27" path="/var/lib/kubelet/pods/7737a699-94c6-46a3-8dda-c93896239f27/volumes"
Apr 22 18:21:14.071650 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.071428 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"]
Apr 22 18:21:14.072238 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.071911 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7737a699-94c6-46a3-8dda-c93896239f27" containerName="main"
Apr 22 18:21:14.072238 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.071927 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7737a699-94c6-46a3-8dda-c93896239f27" containerName="main"
Apr 22 18:21:14.072238 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.071942 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7737a699-94c6-46a3-8dda-c93896239f27" containerName="storage-initializer"
Apr 22 18:21:14.072238 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.071948 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7737a699-94c6-46a3-8dda-c93896239f27" containerName="storage-initializer"
Apr 22 18:21:14.072238 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.072004 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7737a699-94c6-46a3-8dda-c93896239f27" containerName="main"
Apr 22 18:21:14.075204 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.075184 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.086471 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.086443 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\""
Apr 22 18:21:14.086471 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.086462 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:21:14.087531 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.087516 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:21:14.087581 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.087544 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-lp6mp\""
Apr 22 18:21:14.127099 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.127069 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"]
Apr 22 18:21:14.145192 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145166 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.145311 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.145311 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145220 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.145311 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145251 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.145444 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145321 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.145444 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145343 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.145444 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145365 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t52wk\" (UniqueName: \"kubernetes.io/projected/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-kube-api-access-t52wk\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.145444 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145390 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.145444 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.145429 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246395 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246369 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246407 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246758 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246576 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t52wk\" (UniqueName: \"kubernetes.io/projected/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-kube-api-access-t52wk\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246758 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246624 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246758 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246677 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246920 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246920 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246757 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246920 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246820 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.246920 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.246829 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.247107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.247041 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.247107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.247101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.247245 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.247227 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.248785 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.248764 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.248985 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.248969 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.258020 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.257993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.258280 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.258265 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t52wk\" (UniqueName: \"kubernetes.io/projected/d2f9b8f6-1e8c-4f74-aaab-966f318fa93c-kube-api-access-t52wk\") pod \"router-gateway-2-openshift-default-6866b85949-78gbj\" (UID: \"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.385424 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.385327 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:14.590809 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.590784 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"]
Apr 22 18:21:14.593151 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:21:14.593126 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f9b8f6_1e8c_4f74_aaab_966f318fa93c.slice/crio-17b1c9a13233ea2bb8b51a45bccdeb607df392940aee83f2d767abff3df4fb35 WatchSource:0}: Error finding container 17b1c9a13233ea2bb8b51a45bccdeb607df392940aee83f2d767abff3df4fb35: Status 404 returned error can't find the container with id 17b1c9a13233ea2bb8b51a45bccdeb607df392940aee83f2d767abff3df4fb35
Apr 22 18:21:14.936380 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:14.936345 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj" event={"ID":"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c","Type":"ContainerStarted","Data":"17b1c9a13233ea2bb8b51a45bccdeb607df392940aee83f2d767abff3df4fb35"}
Apr 22 18:21:17.065599 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:17.065567 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 18:21:17.065860 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:17.065644 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 18:21:17.065860 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:17.065683 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 22 18:21:17.947561 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:17.947524 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj" event={"ID":"d2f9b8f6-1e8c-4f74-aaab-966f318fa93c","Type":"ContainerStarted","Data":"f49ed9199fba9a5229fb753b85fd93dda623a373e114cebbad66a32fb74f4e96"}
Apr 22 18:21:17.970371 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:17.969500 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj" podStartSLOduration=1.499008783 podStartE2EDuration="3.969480836s" podCreationTimestamp="2026-04-22 18:21:14 +0000 UTC" firstStartedPulling="2026-04-22 18:21:14.594895424 +0000 UTC m=+1702.903526892" lastFinishedPulling="2026-04-22 18:21:17.065367481 +0000 UTC m=+1705.373998945" observedRunningTime="2026-04-22 18:21:17.966898444 +0000 UTC m=+1706.275529936" watchObservedRunningTime="2026-04-22 18:21:17.969480836 +0000 UTC m=+1706.278112324"
Apr 22 18:21:18.385888 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:18.385854 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:18.387203 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:18.387173 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj" podUID="d2f9b8f6-1e8c-4f74-aaab-966f318fa93c" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.40:15021/healthz/ready\": dial tcp 10.132.0.40:15021: connect: connection refused"
Apr 22 18:21:19.386100 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:19.386060 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj" podUID="d2f9b8f6-1e8c-4f74-aaab-966f318fa93c" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.132.0.40:15021/healthz/ready\": dial tcp 10.132.0.40:15021: connect: connection refused"
Apr 22 18:21:20.390206 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:20.390178 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:20.957570 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:20.957536 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:20.958665 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:20.958643 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-78gbj"
Apr 22 18:21:40.167020 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.166982 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"]
Apr 22 18:21:40.169970 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.169951 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:21:40.174814 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.174786 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q5s78\""
Apr 22 18:21:40.175136 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.175122 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 22 18:21:40.176171 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.176154 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-trq4v\""
Apr 22 18:21:40.212415 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.212385 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"]
Apr 22 18:21:40.268619 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.268586 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-dshm\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:21:40.268813 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.268633 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-model-cache\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:21:40.268813 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.268745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:21:40.268813 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.268788 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-home\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:21:40.268946 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.268821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7df7\" (UniqueName: \"kubernetes.io/projected/c051005f-dc3d-4111-9591-79d4d2ec9875-kube-api-access-d7df7\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:21:40.268946 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.268853 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c051005f-dc3d-4111-9591-79d4d2ec9875-tls-certs\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:21:40.346828 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.346798 2568 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"] Apr 22 18:21:40.353723 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.353699 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.369567 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.369541 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"] Apr 22 18:21:40.370112 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.370174 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370150 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.370210 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370198 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c051005f-dc3d-4111-9591-79d4d2ec9875-tls-certs\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.370245 ip-10-0-142-118 
kubenswrapper[2568]: I0422 18:21:40.370226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.370279 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-dshm\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.370313 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370294 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-home\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.370351 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370316 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/379f1de5-5df4-49f8-90b1-8f012c385f6f-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.370351 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370338 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-model-cache\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.370425 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370367 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-827r7\" (UniqueName: \"kubernetes.io/projected/379f1de5-5df4-49f8-90b1-8f012c385f6f-kube-api-access-827r7\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.370425 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370405 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.370542 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-home\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.370542 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7df7\" (UniqueName: 
\"kubernetes.io/projected/c051005f-dc3d-4111-9591-79d4d2ec9875-kube-api-access-d7df7\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.370714 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370693 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-model-cache\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.370799 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370751 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.370875 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.370854 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-home\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.372402 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.372378 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-dshm\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.372662 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.372644 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c051005f-dc3d-4111-9591-79d4d2ec9875-tls-certs\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.399978 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.399948 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7df7\" (UniqueName: \"kubernetes.io/projected/c051005f-dc3d-4111-9591-79d4d2ec9875-kube-api-access-d7df7\") pod \"router-with-refs-pd-test-kserve-7b5546567b-qt4k9\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.471121 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471093 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-home\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.471121 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/379f1de5-5df4-49f8-90b1-8f012c385f6f-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.471320 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471186 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-827r7\" (UniqueName: \"kubernetes.io/projected/379f1de5-5df4-49f8-90b1-8f012c385f6f-kube-api-access-827r7\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.471412 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471393 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.471469 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471436 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.471522 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471478 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.471584 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471530 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-home\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.471722 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471700 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.471854 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.471835 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.473541 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.473515 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.473813 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.473796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/379f1de5-5df4-49f8-90b1-8f012c385f6f-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: 
\"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.478811 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.478789 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-827r7\" (UniqueName: \"kubernetes.io/projected/379f1de5-5df4-49f8-90b1-8f012c385f6f-kube-api-access-827r7\") pod \"router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.478941 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.478929 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:40.606134 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.606108 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"] Apr 22 18:21:40.607638 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:21:40.607608 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc051005f_dc3d_4111_9591_79d4d2ec9875.slice/crio-d611353b8a05c52a378a9ceb1e25f007d17df0f0d1f42b477afc59faaccf7f3f WatchSource:0}: Error finding container d611353b8a05c52a378a9ceb1e25f007d17df0f0d1f42b477afc59faaccf7f3f: Status 404 returned error can't find the container with id d611353b8a05c52a378a9ceb1e25f007d17df0f0d1f42b477afc59faaccf7f3f Apr 22 18:21:40.663551 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.663519 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:40.786087 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:40.786059 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"] Apr 22 18:21:40.787517 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:21:40.787488 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod379f1de5_5df4_49f8_90b1_8f012c385f6f.slice/crio-055ed447271fa650014c3399478c5e9f483c2a2619d29af5cb904794c8e7e283 WatchSource:0}: Error finding container 055ed447271fa650014c3399478c5e9f483c2a2619d29af5cb904794c8e7e283: Status 404 returned error can't find the container with id 055ed447271fa650014c3399478c5e9f483c2a2619d29af5cb904794c8e7e283 Apr 22 18:21:41.022306 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:41.022205 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" event={"ID":"c051005f-dc3d-4111-9591-79d4d2ec9875","Type":"ContainerStarted","Data":"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520"} Apr 22 18:21:41.022306 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:41.022242 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" event={"ID":"c051005f-dc3d-4111-9591-79d4d2ec9875","Type":"ContainerStarted","Data":"d611353b8a05c52a378a9ceb1e25f007d17df0f0d1f42b477afc59faaccf7f3f"} Apr 22 18:21:41.022306 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:41.022287 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:41.023710 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:41.023677 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" event={"ID":"379f1de5-5df4-49f8-90b1-8f012c385f6f","Type":"ContainerStarted","Data":"2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00"} Apr 22 18:21:41.023867 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:41.023714 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" event={"ID":"379f1de5-5df4-49f8-90b1-8f012c385f6f","Type":"ContainerStarted","Data":"055ed447271fa650014c3399478c5e9f483c2a2619d29af5cb904794c8e7e283"} Apr 22 18:21:42.030929 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:42.030892 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" event={"ID":"c051005f-dc3d-4111-9591-79d4d2ec9875","Type":"ContainerStarted","Data":"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab"} Apr 22 18:21:46.046679 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:46.046643 2568 generic.go:358] "Generic (PLEG): container finished" podID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerID="bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab" exitCode=0 Apr 22 18:21:46.047214 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:46.046718 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" event={"ID":"c051005f-dc3d-4111-9591-79d4d2ec9875","Type":"ContainerDied","Data":"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab"} Apr 22 18:21:46.048077 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:46.048057 2568 generic.go:358] "Generic (PLEG): container finished" podID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerID="2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00" exitCode=0 Apr 22 18:21:46.048149 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:46.048102 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" event={"ID":"379f1de5-5df4-49f8-90b1-8f012c385f6f","Type":"ContainerDied","Data":"2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00"} Apr 22 18:21:47.053803 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:47.053765 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" event={"ID":"c051005f-dc3d-4111-9591-79d4d2ec9875","Type":"ContainerStarted","Data":"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa"} Apr 22 18:21:47.055515 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:47.055476 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" event={"ID":"379f1de5-5df4-49f8-90b1-8f012c385f6f","Type":"ContainerStarted","Data":"bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b"} Apr 22 18:21:47.082365 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:47.082305 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podStartSLOduration=7.082285826 podStartE2EDuration="7.082285826s" podCreationTimestamp="2026-04-22 18:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:47.079079149 +0000 UTC m=+1735.387710636" watchObservedRunningTime="2026-04-22 18:21:47.082285826 +0000 UTC m=+1735.390917313" Apr 22 18:21:47.102983 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:47.102928 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podStartSLOduration=7.102915072 podStartE2EDuration="7.102915072s" podCreationTimestamp="2026-04-22 18:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:47.100590421 +0000 UTC m=+1735.409221932" watchObservedRunningTime="2026-04-22 18:21:47.102915072 +0000 UTC m=+1735.411546598" Apr 22 18:21:50.479700 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:50.479657 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:50.479700 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:50.479704 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:50.481245 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:50.481208 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 22 18:21:50.498600 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:50.498570 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:21:50.663932 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:50.663888 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:50.663932 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:50.663940 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:21:50.665702 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:21:50.665671 2568 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 22 18:22:00.480283 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:00.480222 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 22 18:22:00.664770 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:00.664710 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 22 18:22:10.480232 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:10.480182 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused" Apr 22 18:22:10.665013 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:10.664971 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused" Apr 22 18:22:20.479615 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:20.479564 2568 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:22:20.664366 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:20.664313 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:22:30.480009 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:30.479944 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:22:30.665031 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:30.664987 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:22:40.480295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:40.480238 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:22:40.664166 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:40.664122 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:22:50.479989 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:50.479938 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:22:50.663941 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:50.663896 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:22:52.236539 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:52.236511 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log"
Apr 22 18:22:52.238841 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:22:52.238816 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log"
Apr 22 18:23:00.479856 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:00.479806 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:23:00.664037 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:00.663986 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:23:10.480512 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:10.480446 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:23:10.664210 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:10.664149 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:23:20.480265 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:20.480208 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:23:20.664682 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:20.664638 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:23:30.480078 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:30.480022 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:23:30.664323 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:30.664277 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:23:40.480460 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:40.480418 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:23:40.664218 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:40.664168 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:23:50.479858 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:50.479809 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8001/health\": dial tcp 10.132.0.41:8001: connect: connection refused"
Apr 22 18:23:50.664016 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:23:50.663972 2568 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" probeResult="failure" output="Get \"https://10.132.0.42:8000/health\": dial tcp 10.132.0.42:8000: connect: connection refused"
Apr 22 18:24:00.489129 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:00.489095 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:24:00.511875 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:00.511848 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:24:00.673832 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:00.673799 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"
Apr 22 18:24:00.681747 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:00.681712 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"
Apr 22 18:24:12.564588 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:12.564510 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"]
Apr 22 18:24:12.565000 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:12.564978 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main" containerID="cri-o://4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa" gracePeriod=30
Apr 22 18:24:12.575821 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:12.575794 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"]
Apr 22 18:24:12.576152 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:12.576127 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main" containerID="cri-o://bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b" gracePeriod=30
Apr 22 18:24:27.872417 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:27.872385 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:27.910409 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:27.910382 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:27.917576 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:27.917556 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:27.939227 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:27.939189 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:27.975459 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:27.975433 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:27.989297 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:27.989273 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:28.941649 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:28.941621 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:28.965879 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:28.965856 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:28.973015 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:28.972972 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:28.982327 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:28.982305 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:29.002660 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:29.002637 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:29.011564 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:29.011533 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:29.997971 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:29.997940 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:30.021695 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:30.021668 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:30.029472 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:30.029449 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:30.039072 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:30.039045 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:30.060141 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:30.060122 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:30.070844 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:30.070828 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:31.001424 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:31.001393 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:31.038473 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:31.038452 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:31.045309 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:31.045289 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:31.055265 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:31.055227 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:31.073843 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:31.073822 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:31.080002 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:31.079981 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:32.005407 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:32.005373 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:32.031007 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:32.030979 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:32.041774 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:32.041755 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:32.051385 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:32.051365 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:32.069763 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:32.069743 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:32.075766 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:32.075722 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:32.983643 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:32.983616 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:33.008203 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:33.008177 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:33.015907 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:33.015886 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:33.025808 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:33.025783 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:33.044660 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:33.044635 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:33.052521 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:33.052504 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:33.999496 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:33.999466 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:34.023996 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:34.023965 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:34.033315 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:34.033289 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:34.043312 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:34.043266 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:34.062882 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:34.062861 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:34.069976 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:34.069954 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:35.042674 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:35.042644 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:35.069110 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:35.069088 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:35.083149 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:35.083128 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:35.100716 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:35.100685 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:35.128478 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:35.128448 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:35.136576 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:35.136541 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:36.163321 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:36.163293 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:36.222110 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:36.222084 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:36.229064 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:36.229034 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:36.238438 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:36.238416 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:36.256884 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:36.256862 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:36.266301 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:36.266279 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:37.265039 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:37.265005 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:37.291107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:37.291082 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:37.298323 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:37.298302 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:37.308272 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:37.308256 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:37.330066 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:37.330037 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:37.339342 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:37.339279 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:38.390366 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:38.390325 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:38.415221 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:38.415190 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:38.425065 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:38.425035 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:38.435605 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:38.435583 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:38.454666 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:38.454643 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:38.461937 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:38.461917 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:39.473694 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:39.473657 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:39.501052 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:39.501022 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:39.512748 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:39.512690 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:39.526940 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:39.526915 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:39.548985 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:39.548951 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:39.559577 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:39.559558 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:40.595977 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:40.595944 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:40.621868 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:40.621837 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:40.647424 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:40.647398 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:40.662457 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:40.662430 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:40.682242 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:40.682213 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:40.689040 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:40.689020 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:41.654426 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:41.654391 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-78gbj_d2f9b8f6-1e8c-4f74-aaab-966f318fa93c/istio-proxy/0.log"
Apr 22 18:24:41.686979 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:41.686948 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:41.695022 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:41.694995 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/llm-d-routing-sidecar/0.log"
Apr 22 18:24:41.705911 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:41.705884 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/storage-initializer/0.log"
Apr 22 18:24:41.726692 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:41.726664 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/main/0.log"
Apr 22 18:24:41.736130 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:41.736104 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg_379f1de5-5df4-49f8-90b1-8f012c385f6f/storage-initializer/0.log"
Apr 22 18:24:42.565844 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.565802 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="llm-d-routing-sidecar" containerID="cri-o://bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520" gracePeriod=2
Apr 22 18:24:42.806585 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.806546 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b747876cb-7f77q_224a42db-ff4d-4e18-a064-b7f2a7b10e91/router/0.log"
Apr 22 18:24:42.831923 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.831897 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log"
Apr 22 18:24:42.832531 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.832513 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"
Apr 22 18:24:42.835239 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.835222 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"
Apr 22 18:24:42.959981 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.959939 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-kserve-provision-location\") pod \"c051005f-dc3d-4111-9591-79d4d2ec9875\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") "
Apr 22 18:24:42.960188 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.959999 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-home\") pod \"c051005f-dc3d-4111-9591-79d4d2ec9875\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") "
Apr 22 18:24:42.960188 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960025 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-827r7\" (UniqueName: \"kubernetes.io/projected/379f1de5-5df4-49f8-90b1-8f012c385f6f-kube-api-access-827r7\") pod \"379f1de5-5df4-49f8-90b1-8f012c385f6f\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") "
Apr 22 18:24:42.960188 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960056 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-dshm\") pod \"c051005f-dc3d-4111-9591-79d4d2ec9875\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") "
Apr 22 18:24:42.960188 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960077 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-model-cache\") pod \"379f1de5-5df4-49f8-90b1-8f012c385f6f\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") "
Apr 22 18:24:42.960188 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960163 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/379f1de5-5df4-49f8-90b1-8f012c385f6f-tls-certs\") pod \"379f1de5-5df4-49f8-90b1-8f012c385f6f\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") "
Apr 22 18:24:42.960464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960220 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-dshm\") pod \"379f1de5-5df4-49f8-90b1-8f012c385f6f\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") "
Apr 22 18:24:42.960464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960260 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c051005f-dc3d-4111-9591-79d4d2ec9875-tls-certs\") pod \"c051005f-dc3d-4111-9591-79d4d2ec9875\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") "
Apr 22 18:24:42.960464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960310 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-kserve-provision-location\") pod \"379f1de5-5df4-49f8-90b1-8f012c385f6f\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") "
Apr 22 18:24:42.960464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960339 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-model-cache" (OuterVolumeSpecName: "model-cache") pod "379f1de5-5df4-49f8-90b1-8f012c385f6f" (UID: "379f1de5-5df4-49f8-90b1-8f012c385f6f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:24:42.960464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960344 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-model-cache\") pod \"c051005f-dc3d-4111-9591-79d4d2ec9875\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") "
Apr 22 18:24:42.960464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960392 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-home" (OuterVolumeSpecName: "home") pod "c051005f-dc3d-4111-9591-79d4d2ec9875" (UID: "c051005f-dc3d-4111-9591-79d4d2ec9875"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:24:42.960464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960403 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-home\") pod \"379f1de5-5df4-49f8-90b1-8f012c385f6f\" (UID: \"379f1de5-5df4-49f8-90b1-8f012c385f6f\") "
Apr 22 18:24:42.960464 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960440 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7df7\" (UniqueName: \"kubernetes.io/projected/c051005f-dc3d-4111-9591-79d4d2ec9875-kube-api-access-d7df7\") pod \"c051005f-dc3d-4111-9591-79d4d2ec9875\" (UID: \"c051005f-dc3d-4111-9591-79d4d2ec9875\") "
Apr 22 18:24:42.960944 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960582 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-model-cache"
(OuterVolumeSpecName: "model-cache") pod "c051005f-dc3d-4111-9591-79d4d2ec9875" (UID: "c051005f-dc3d-4111-9591-79d4d2ec9875"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:42.960944 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960783 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:42.960944 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960801 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:42.960944 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.960813 2568 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-model-cache\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:42.961430 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.961175 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-home" (OuterVolumeSpecName: "home") pod "379f1de5-5df4-49f8-90b1-8f012c385f6f" (UID: "379f1de5-5df4-49f8-90b1-8f012c385f6f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:42.962822 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.962706 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/379f1de5-5df4-49f8-90b1-8f012c385f6f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "379f1de5-5df4-49f8-90b1-8f012c385f6f" (UID: "379f1de5-5df4-49f8-90b1-8f012c385f6f"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:42.963005 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.962879 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c051005f-dc3d-4111-9591-79d4d2ec9875-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c051005f-dc3d-4111-9591-79d4d2ec9875" (UID: "c051005f-dc3d-4111-9591-79d4d2ec9875"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:24:42.963005 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.962892 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-dshm" (OuterVolumeSpecName: "dshm") pod "c051005f-dc3d-4111-9591-79d4d2ec9875" (UID: "c051005f-dc3d-4111-9591-79d4d2ec9875"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:42.963005 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.962911 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379f1de5-5df4-49f8-90b1-8f012c385f6f-kube-api-access-827r7" (OuterVolumeSpecName: "kube-api-access-827r7") pod "379f1de5-5df4-49f8-90b1-8f012c385f6f" (UID: "379f1de5-5df4-49f8-90b1-8f012c385f6f"). InnerVolumeSpecName "kube-api-access-827r7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:42.963521 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.963490 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c051005f-dc3d-4111-9591-79d4d2ec9875-kube-api-access-d7df7" (OuterVolumeSpecName: "kube-api-access-d7df7") pod "c051005f-dc3d-4111-9591-79d4d2ec9875" (UID: "c051005f-dc3d-4111-9591-79d4d2ec9875"). InnerVolumeSpecName "kube-api-access-d7df7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:24:42.963521 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:42.963511 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-dshm" (OuterVolumeSpecName: "dshm") pod "379f1de5-5df4-49f8-90b1-8f012c385f6f" (UID: "379f1de5-5df4-49f8-90b1-8f012c385f6f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:43.020573 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.020522 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "379f1de5-5df4-49f8-90b1-8f012c385f6f" (UID: "379f1de5-5df4-49f8-90b1-8f012c385f6f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:43.021258 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.021235 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c051005f-dc3d-4111-9591-79d4d2ec9875" (UID: "c051005f-dc3d-4111-9591-79d4d2ec9875"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:24:43.061117 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061089 2568 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.061117 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061113 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-827r7\" (UniqueName: \"kubernetes.io/projected/379f1de5-5df4-49f8-90b1-8f012c385f6f-kube-api-access-827r7\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.061117 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061123 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c051005f-dc3d-4111-9591-79d4d2ec9875-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.061360 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061132 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/379f1de5-5df4-49f8-90b1-8f012c385f6f-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.061360 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061142 2568 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-dshm\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.061360 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061149 2568 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c051005f-dc3d-4111-9591-79d4d2ec9875-tls-certs\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.061360 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061157 2568 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-kserve-provision-location\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.061360 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061165 2568 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/379f1de5-5df4-49f8-90b1-8f012c385f6f-home\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.061360 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.061173 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7df7\" (UniqueName: \"kubernetes.io/projected/c051005f-dc3d-4111-9591-79d4d2ec9875-kube-api-access-d7df7\") on node \"ip-10-0-142-118.ec2.internal\" DevicePath \"\"" Apr 22 18:24:43.667280 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.667256 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7b5546567b-qt4k9_c051005f-dc3d-4111-9591-79d4d2ec9875/main/0.log" Apr 22 18:24:43.667865 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.667842 2568 generic.go:358] "Generic (PLEG): container finished" podID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerID="4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa" exitCode=137 Apr 22 18:24:43.667865 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.667864 2568 generic.go:358] "Generic (PLEG): container finished" podID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerID="bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520" exitCode=0 Apr 22 18:24:43.667984 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.667914 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" Apr 22 18:24:43.667984 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.667922 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" event={"ID":"c051005f-dc3d-4111-9591-79d4d2ec9875","Type":"ContainerDied","Data":"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa"} Apr 22 18:24:43.667984 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.667968 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" event={"ID":"c051005f-dc3d-4111-9591-79d4d2ec9875","Type":"ContainerDied","Data":"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520"} Apr 22 18:24:43.667984 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.667980 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9" event={"ID":"c051005f-dc3d-4111-9591-79d4d2ec9875","Type":"ContainerDied","Data":"d611353b8a05c52a378a9ceb1e25f007d17df0f0d1f42b477afc59faaccf7f3f"} Apr 22 18:24:43.668163 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.667995 2568 scope.go:117] "RemoveContainer" containerID="4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa" Apr 22 18:24:43.669338 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.669319 2568 generic.go:358] "Generic (PLEG): container finished" podID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerID="bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b" exitCode=137 Apr 22 18:24:43.669424 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.669353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" 
event={"ID":"379f1de5-5df4-49f8-90b1-8f012c385f6f","Type":"ContainerDied","Data":"bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b"} Apr 22 18:24:43.669424 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.669372 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" event={"ID":"379f1de5-5df4-49f8-90b1-8f012c385f6f","Type":"ContainerDied","Data":"055ed447271fa650014c3399478c5e9f483c2a2619d29af5cb904794c8e7e283"} Apr 22 18:24:43.669424 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.669385 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg" Apr 22 18:24:43.688640 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.688624 2568 scope.go:117] "RemoveContainer" containerID="bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab" Apr 22 18:24:43.694570 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.694502 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b747876cb-7f77q_224a42db-ff4d-4e18-a064-b7f2a7b10e91/router/0.log" Apr 22 18:24:43.694904 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.694882 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"] Apr 22 18:24:43.699403 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.699384 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7b5546567b-qt4k9"] Apr 22 18:24:43.699955 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.699938 2568 scope.go:117] "RemoveContainer" containerID="bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520" Apr 22 18:24:43.706570 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.706553 2568 scope.go:117] "RemoveContainer" 
containerID="4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa" Apr 22 18:24:43.706852 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:24:43.706834 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa\": container with ID starting with 4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa not found: ID does not exist" containerID="4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa" Apr 22 18:24:43.706912 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.706861 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa"} err="failed to get container status \"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa\": rpc error: code = NotFound desc = could not find container \"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa\": container with ID starting with 4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa not found: ID does not exist" Apr 22 18:24:43.706912 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.706879 2568 scope.go:117] "RemoveContainer" containerID="bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab" Apr 22 18:24:43.707096 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:24:43.707082 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab\": container with ID starting with bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab not found: ID does not exist" containerID="bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab" Apr 22 18:24:43.707133 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707098 2568 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab"} err="failed to get container status \"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab\": rpc error: code = NotFound desc = could not find container \"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab\": container with ID starting with bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab not found: ID does not exist" Apr 22 18:24:43.707133 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707114 2568 scope.go:117] "RemoveContainer" containerID="bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520" Apr 22 18:24:43.707310 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:24:43.707291 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520\": container with ID starting with bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520 not found: ID does not exist" containerID="bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520" Apr 22 18:24:43.707355 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707319 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520"} err="failed to get container status \"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520\": rpc error: code = NotFound desc = could not find container \"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520\": container with ID starting with bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520 not found: ID does not exist" Apr 22 18:24:43.707355 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707339 2568 scope.go:117] "RemoveContainer" containerID="4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa" Apr 22 18:24:43.707539 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707519 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa"} err="failed to get container status \"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa\": rpc error: code = NotFound desc = could not find container \"4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa\": container with ID starting with 4831bb4f08e0f4b954214f5a4b132a4f34fa8d8c3851c8274efaae14d75acefa not found: ID does not exist" Apr 22 18:24:43.707593 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707539 2568 scope.go:117] "RemoveContainer" containerID="bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab" Apr 22 18:24:43.707743 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707710 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab"} err="failed to get container status \"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab\": rpc error: code = NotFound desc = could not find container \"bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab\": container with ID starting with bc1dad4126300a78ba1050da804d33fe6438ef6b736eb9ac820f2205c95dbeab not found: ID does not exist" Apr 22 18:24:43.707801 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707788 2568 scope.go:117] "RemoveContainer" containerID="bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520" Apr 22 18:24:43.707960 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707939 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520"} err="failed to get container status \"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520\": rpc error: code = NotFound desc = could not 
find container \"bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520\": container with ID starting with bde834747e2d72079f61a89fc48399fd23f1b2bed34c13f6e1e48f971aee8520 not found: ID does not exist" Apr 22 18:24:43.707960 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.707958 2568 scope.go:117] "RemoveContainer" containerID="bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b" Apr 22 18:24:43.716866 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.716849 2568 scope.go:117] "RemoveContainer" containerID="2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00" Apr 22 18:24:43.724810 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.724788 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"] Apr 22 18:24:43.726806 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.726786 2568 scope.go:117] "RemoveContainer" containerID="bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b" Apr 22 18:24:43.727057 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.727039 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-6f58f88dfc-gdgpg"] Apr 22 18:24:43.727116 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:24:43.727043 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b\": container with ID starting with bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b not found: ID does not exist" containerID="bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b" Apr 22 18:24:43.727116 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.727083 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b"} err="failed to get 
container status \"bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b\": rpc error: code = NotFound desc = could not find container \"bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b\": container with ID starting with bf170687897d8b6929ccf1b8c8f82f0d9d19b52a9d42598dd69858eb6406fb8b not found: ID does not exist" Apr 22 18:24:43.727116 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.727097 2568 scope.go:117] "RemoveContainer" containerID="2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00" Apr 22 18:24:43.727324 ip-10-0-142-118 kubenswrapper[2568]: E0422 18:24:43.727309 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00\": container with ID starting with 2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00 not found: ID does not exist" containerID="2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00" Apr 22 18:24:43.727383 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:43.727327 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00"} err="failed to get container status \"2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00\": rpc error: code = NotFound desc = could not find container \"2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00\": container with ID starting with 2999944f0bbe91cedb63092af548ebc68e729d92ae2e33ba1646b7b66e8b0d00 not found: ID does not exist" Apr 22 18:24:44.212453 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:44.212419 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" path="/var/lib/kubelet/pods/379f1de5-5df4-49f8-90b1-8f012c385f6f/volumes" Apr 22 18:24:44.212879 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:44.212863 2568 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" path="/var/lib/kubelet/pods/c051005f-dc3d-4111-9591-79d4d2ec9875/volumes" Apr 22 18:24:44.488296 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:44.488188 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-2mdqj_69d51c27-f087-405f-99f9-cc012eb420cd/authorino/0.log" Apr 22 18:24:44.544851 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:44.544820 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-4psqn_090e3c02-96d2-479e-8871-e8358b3f1d4e/manager/0.log" Apr 22 18:24:44.561160 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:44.561135 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-6jf4s_eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c/kuadrant-console-plugin/0.log" Apr 22 18:24:44.627887 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:44.627852 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-sttvs_682a9f48-b938-41dd-8f69-4753f78876f5/limitador/0.log" Apr 22 18:24:49.845724 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:49.845686 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ldvlp_0289f618-f4aa-4688-a261-c755d1a71444/global-pull-secret-syncer/0.log" Apr 22 18:24:49.951506 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:49.951469 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ql9lr_1bfe7678-f24d-4f1f-81a3-b65e7179ae30/konnectivity-agent/0.log" Apr 22 18:24:50.047770 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:50.047723 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-118.ec2.internal_1efe39c18a96fb22c7e6fa00ec347d37/haproxy/0.log" Apr 22 18:24:54.003212 
ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:54.003162 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-2mdqj_69d51c27-f087-405f-99f9-cc012eb420cd/authorino/0.log"
Apr 22 18:24:54.059537 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:54.059489 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-4psqn_090e3c02-96d2-479e-8871-e8358b3f1d4e/manager/0.log"
Apr 22 18:24:54.079123 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:54.079047 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-6jf4s_eb5a9fc9-46b6-459b-99df-2b4ffaa8e90c/kuadrant-console-plugin/0.log"
Apr 22 18:24:54.157790 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:54.157762 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-sttvs_682a9f48-b938-41dd-8f69-4753f78876f5/limitador/0.log"
Apr 22 18:24:55.338207 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:55.338180 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-pmdgh_28c65550-3cca-4589-82a4-baaf985beda6/cluster-monitoring-operator/0.log"
Apr 22 18:24:55.522227 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:55.522198 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fvnc2_97f2c808-a28a-451a-ac4a-bd5f265698e3/node-exporter/0.log"
Apr 22 18:24:55.544400 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:55.544373 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fvnc2_97f2c808-a28a-451a-ac4a-bd5f265698e3/kube-rbac-proxy/0.log"
Apr 22 18:24:55.575070 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:55.575043 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fvnc2_97f2c808-a28a-451a-ac4a-bd5f265698e3/init-textfile/0.log"
Apr 22 18:24:58.008606 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.008572 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/1.log"
Apr 22 18:24:58.017350 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.017308 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-7v4cv_0972f1d3-8168-44be-896c-c3d80cd4c9d7/console-operator/2.log"
Apr 22 18:24:58.704342 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704305 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"]
Apr 22 18:24:58.704628 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704614 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="storage-initializer"
Apr 22 18:24:58.704672 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704630 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="storage-initializer"
Apr 22 18:24:58.704672 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704638 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main"
Apr 22 18:24:58.704672 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704643 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main"
Apr 22 18:24:58.704672 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704656 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="llm-d-routing-sidecar"
Apr 22 18:24:58.704672 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704662 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="llm-d-routing-sidecar"
Apr 22 18:24:58.704845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704676 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="storage-initializer"
Apr 22 18:24:58.704845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704682 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="storage-initializer"
Apr 22 18:24:58.704845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704689 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main"
Apr 22 18:24:58.704845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704694 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main"
Apr 22 18:24:58.704845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704757 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="379f1de5-5df4-49f8-90b1-8f012c385f6f" containerName="main"
Apr 22 18:24:58.704845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704768 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="main"
Apr 22 18:24:58.704845 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.704780 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c051005f-dc3d-4111-9591-79d4d2ec9875" containerName="llm-d-routing-sidecar"
Apr 22 18:24:58.708151 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.708133 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.710671 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.710648 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kx68p\"/\"openshift-service-ca.crt\""
Apr 22 18:24:58.710783 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.710659 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kx68p\"/\"kube-root-ca.crt\""
Apr 22 18:24:58.711657 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.711635 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kx68p\"/\"default-dockercfg-c5qsr\""
Apr 22 18:24:58.716892 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.716870 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"]
Apr 22 18:24:58.894408 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.894378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-sys\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.894408 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.894413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbt8\" (UniqueName: \"kubernetes.io/projected/6a219692-d295-4b91-b922-2147ed710975-kube-api-access-kxbt8\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.894722 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.894441 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-podres\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.894722 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.894530 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-lib-modules\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.894722 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.894596 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-proc\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.965605 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.965531 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-vqhdv_418e6314-c842-4a4a-82f4-6daab5c36653/volume-data-source-validator/0.log"
Apr 22 18:24:58.995918 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.995886 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-sys\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.996071 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.995923 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbt8\" (UniqueName: \"kubernetes.io/projected/6a219692-d295-4b91-b922-2147ed710975-kube-api-access-kxbt8\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.996071 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.995963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-podres\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.996071 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.996002 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-lib-modules\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.996071 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.996009 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-sys\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.996295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.996081 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-podres\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.996295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.996095 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-proc\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.996295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.996118 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-lib-modules\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:58.996295 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:58.996139 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6a219692-d295-4b91-b922-2147ed710975-proc\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:59.005008 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.004985 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbt8\" (UniqueName: \"kubernetes.io/projected/6a219692-d295-4b91-b922-2147ed710975-kube-api-access-kxbt8\") pod \"perf-node-gather-daemonset-nwdnj\" (UID: \"6a219692-d295-4b91-b922-2147ed710975\") " pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:59.018892 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.018869 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:59.143286 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.143259 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"]
Apr 22 18:24:59.144357 ip-10-0-142-118 kubenswrapper[2568]: W0422 18:24:59.144327 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6a219692_d295_4b91_b922_2147ed710975.slice/crio-898dea9c4dd5c02e4489343c39950340c9e654c9a32a938483cc682d21b76a43 WatchSource:0}: Error finding container 898dea9c4dd5c02e4489343c39950340c9e654c9a32a938483cc682d21b76a43: Status 404 returned error can't find the container with id 898dea9c4dd5c02e4489343c39950340c9e654c9a32a938483cc682d21b76a43
Apr 22 18:24:59.145922 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.145901 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:24:59.721094 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.721058 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj" event={"ID":"6a219692-d295-4b91-b922-2147ed710975","Type":"ContainerStarted","Data":"1c1f760e29dc2a764aa446880f76c0952778a817aff862c1499372eac6660863"}
Apr 22 18:24:59.721094 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.721098 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj" event={"ID":"6a219692-d295-4b91-b922-2147ed710975","Type":"ContainerStarted","Data":"898dea9c4dd5c02e4489343c39950340c9e654c9a32a938483cc682d21b76a43"}
Apr 22 18:24:59.721321 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.721128 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:24:59.738112 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.738060 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj" podStartSLOduration=1.73804555 podStartE2EDuration="1.73804555s" podCreationTimestamp="2026-04-22 18:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:24:59.73629844 +0000 UTC m=+1928.044929927" watchObservedRunningTime="2026-04-22 18:24:59.73804555 +0000 UTC m=+1928.046677036"
Apr 22 18:24:59.761651 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.761614 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zs6sw_8f0708c3-8b05-45e1-9d30-ca3772151671/dns/0.log"
Apr 22 18:24:59.780011 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.779980 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zs6sw_8f0708c3-8b05-45e1-9d30-ca3772151671/kube-rbac-proxy/0.log"
Apr 22 18:24:59.840130 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:24:59.840099 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sbk9w_3cc6474c-a1f9-41c8-9a45-6ec7dc3f52ca/dns-node-resolver/0.log"
Apr 22 18:25:00.301399 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:00.301369 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tjtfp_8c0ae7fd-c205-4928-b51f-9f80202d3f77/node-ca/0.log"
Apr 22 18:25:01.208122 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:01.208088 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b747876cb-7f77q_224a42db-ff4d-4e18-a064-b7f2a7b10e91/router/0.log"
Apr 22 18:25:01.641148 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:01.641067 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rnpt6_fa19e254-4e3d-4822-81d3-7ea095625185/serve-healthcheck-canary/0.log"
Apr 22 18:25:02.049107 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:02.049075 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-vkb44_75454aa1-9f9c-481a-b5e0-248d97ce5213/insights-operator/0.log"
Apr 22 18:25:02.049901 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:02.049877 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-vkb44_75454aa1-9f9c-481a-b5e0-248d97ce5213/insights-operator/1.log"
Apr 22 18:25:02.119888 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:02.119862 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-77tjc_d3db4b4c-de2a-4504-ac79-1a6d59c9892e/kube-rbac-proxy/0.log"
Apr 22 18:25:02.135180 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:02.135158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-77tjc_d3db4b4c-de2a-4504-ac79-1a6d59c9892e/exporter/0.log"
Apr 22 18:25:02.153308 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:02.153273 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-77tjc_d3db4b4c-de2a-4504-ac79-1a6d59c9892e/extractor/0.log"
Apr 22 18:25:04.684465 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:04.684431 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5b9bbc5c4d-cplp2_46198b9a-ac49-4f44-8d9c-fc29591ae093/manager/0.log"
Apr 22 18:25:05.734642 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:05.734615 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kx68p/perf-node-gather-daemonset-nwdnj"
Apr 22 18:25:10.467711 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:10.467679 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-kwxxz_99308cc1-5395-417c-bf2d-54fe0c5411d7/kube-storage-version-migrator-operator/1.log"
Apr 22 18:25:10.469540 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:10.469518 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-kwxxz_99308cc1-5395-417c-bf2d-54fe0c5411d7/kube-storage-version-migrator-operator/0.log"
Apr 22 18:25:11.371669 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:11.371632 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4ljm6_0bd9542b-a42c-4dbd-a379-4f7eea0a1ca3/kube-multus/0.log"
Apr 22 18:25:11.712452 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:11.712424 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s7s7v_e29ab8a7-8881-4951-93eb-55d0b996dbcb/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:25:11.732012 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:11.731981 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s7s7v_e29ab8a7-8881-4951-93eb-55d0b996dbcb/egress-router-binary-copy/0.log"
Apr 22 18:25:11.748699 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:11.748676 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s7s7v_e29ab8a7-8881-4951-93eb-55d0b996dbcb/cni-plugins/0.log"
Apr 22 18:25:11.763676 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:11.763644 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s7s7v_e29ab8a7-8881-4951-93eb-55d0b996dbcb/bond-cni-plugin/0.log"
Apr 22 18:25:11.780446 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:11.780423 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s7s7v_e29ab8a7-8881-4951-93eb-55d0b996dbcb/routeoverride-cni/0.log"
Apr 22 18:25:11.794638 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:11.794620 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s7s7v_e29ab8a7-8881-4951-93eb-55d0b996dbcb/whereabouts-cni-bincopy/0.log"
Apr 22 18:25:11.809824 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:11.809804 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s7s7v_e29ab8a7-8881-4951-93eb-55d0b996dbcb/whereabouts-cni/0.log"
Apr 22 18:25:12.008157 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:12.008072 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k7kpf_ab99124f-2959-4b17-ab76-24041f074fe5/network-metrics-daemon/0.log"
Apr 22 18:25:12.022667 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:12.022640 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-k7kpf_ab99124f-2959-4b17-ab76-24041f074fe5/kube-rbac-proxy/0.log"
Apr 22 18:25:13.358155 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:13.358130 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fvkcv_b920c1ec-1c95-459e-a9cf-a36565ac5b48/ovn-controller/0.log"
Apr 22 18:25:13.387133 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:13.387108 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fvkcv_b920c1ec-1c95-459e-a9cf-a36565ac5b48/ovn-acl-logging/0.log"
Apr 22 18:25:13.407562 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:13.407537 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fvkcv_b920c1ec-1c95-459e-a9cf-a36565ac5b48/kube-rbac-proxy-node/0.log"
Apr 22 18:25:13.424782 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:13.424756 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fvkcv_b920c1ec-1c95-459e-a9cf-a36565ac5b48/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:25:13.447577 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:13.447553 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fvkcv_b920c1ec-1c95-459e-a9cf-a36565ac5b48/northd/0.log"
Apr 22 18:25:13.467560 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:13.467530 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fvkcv_b920c1ec-1c95-459e-a9cf-a36565ac5b48/nbdb/0.log"
Apr 22 18:25:13.487521 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:13.487502 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fvkcv_b920c1ec-1c95-459e-a9cf-a36565ac5b48/sbdb/0.log"
Apr 22 18:25:13.648657 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:13.648575 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fvkcv_b920c1ec-1c95-459e-a9cf-a36565ac5b48/ovnkube-controller/0.log"
Apr 22 18:25:14.972758 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:14.972706 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-9wdzx_0b69db01-4663-4db0-84fe-b0eaeccdfb5a/check-endpoints/0.log"
Apr 22 18:25:14.994549 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:14.994522 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4phwt_d950d834-86a0-437a-b1c6-30e88678d30b/network-check-target-container/0.log"
Apr 22 18:25:16.004419 ip-10-0-142-118 kubenswrapper[2568]: I0422 18:25:16.004389 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-h4knh_d1e6d5b7-a3d0-4a7a-965b-c59191a9dbfd/iptables-alerter/0.log"