Apr 22 19:21:27.882815 ip-10-0-132-160 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 19:21:27.882827 ip-10-0-132-160 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 19:21:27.882834 ip-10-0-132-160 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 19:21:27.883043 ip-10-0-132-160 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 19:21:38.044914 ip-10-0-132-160 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 19:21:38.044929 ip-10-0-132-160 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7f92c72cbca3417ea084643c0e29a33c --
Apr 22 19:24:10.103789 ip-10-0-132-160 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:24:10.528468 ip-10-0-132-160 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:24:10.528468 ip-10-0-132-160 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:24:10.528468 ip-10-0-132-160 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:24:10.528468 ip-10-0-132-160 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
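The deprecation warnings above all point at the same remedy: move these flag values into the file passed to --config (on this node, /etc/kubernetes/kubelet.conf, as the FLAG dump later in the log shows). A minimal sketch of what such a KubeletConfiguration fragment might look like, assuming the values visible in this log; the field names come from the upstream KubeletConfiguration API, and the volumePluginDir path and systemReserved values are illustrative assumptions, not taken from this node:

```yaml
# Hypothetical fragment of /etc/kubernetes/kubelet.conf (KubeletConfiguration).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces the deprecated --container-runtime-endpoint flag.
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# Replaces the deprecated --volume-plugin-dir flag (path is an assumption).
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# Replaces the deprecated --system-reserved flag (values are assumptions).
systemReserved:
  cpu: 500m
  memory: 1Gi
```

There is no config-file equivalent for --pod-infra-container-image; per the warning, the sandbox image is expected to come from the CRI runtime's own configuration instead.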
Apr 22 19:24:10.528468 ip-10-0-132-160 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:24:10.529868 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.529779 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:24:10.531980 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.531965 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.531981 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.531986 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.531990 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.531993 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.531996 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.531999 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532001 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532004 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532007 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532010 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532013 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532016 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532019 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:24:10.532015 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532022 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532025 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532028 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532035 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532038 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532041 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532043 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532046 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532049 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532051 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532054 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532056 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532059 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532061 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532065 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532067 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532070 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532074 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532078 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:24:10.532385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532081 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532084 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532086 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532089 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532092 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532094 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532097 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532099 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532102 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532104 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532107 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532110 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532112 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532115 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532118 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532121 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532124 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532126 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532129 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532132 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:24:10.532883 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532135 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532138 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532140 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532143 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532145 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532148 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532151 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532153 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532156 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532158 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532161 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532163 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532167 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532169 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532172 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532175 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532178 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532180 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532183 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532185 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:24:10.533363 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532188 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532191 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532214 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532218 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532221 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532224 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532226 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532230 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532233 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532236 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532240 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532242 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.532245 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533081 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533088 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533091 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533094 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533098 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533101 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533104 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:24:10.533882 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533107 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533109 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533112 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533115 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533117 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533120 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533123 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533125 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533128 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533130 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533133 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533136 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533139 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533142 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533145 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533147 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533150 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533152 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533155 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533158 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:24:10.534367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533161 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533164 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533166 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533169 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533172 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533175 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533177 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533180 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533182 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533185 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533188 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533191 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533193 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533196 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533198 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533201 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533204 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533206 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533209 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:24:10.534918 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533211 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533214 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533216 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533219 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533221 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533224 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533227 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533231 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533234 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533237 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533240 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533243 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533247 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533250 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533253 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533255 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533258 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533260 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533263 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533265 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:24:10.535385 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533268 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533270 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533273 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533275 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533278 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533280 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533283 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533285 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533288 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533290 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533294 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533298 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533301 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533304 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533307 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533309 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533312 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533314 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533317 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.533320 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:24:10.535896 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534362 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534371 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534378 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534383 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534388 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534393 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534397 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534403 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534406 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534410 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534413 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534417 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534420 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534423 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534426 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534429 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534432 2576 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534435 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534438 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534442 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534445 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534448 2576 flags.go:64] FLAG: --config-dir=""
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534451 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534454 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:24:10.536399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534458 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534461 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534465 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534468 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534471 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534475 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534477 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534481 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534484 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534489 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534492 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534494 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534510 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534514 2576 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534517 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534522 2576 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534525 2576 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534528 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534531 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534534 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534538 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534541 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534545 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22
19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534548 2576 flags.go:64] FLAG: --eviction-soft="" Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534551 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 19:24:10.536995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534554 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534557 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534559 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534562 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534566 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534569 2576 flags.go:64] FLAG: --feature-gates="" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534573 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534576 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534579 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534583 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534586 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534589 2576 flags.go:64] FLAG: --help="false" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534592 2576 flags.go:64] FLAG: 
--hostname-override="ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534595 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534598 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534601 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534605 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534609 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534612 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534615 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534618 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534621 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534624 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534627 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:24:10.537594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534630 2576 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534634 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:24:10.538168 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:24:10.534636 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534640 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534642 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534645 2576 flags.go:64] FLAG: --lock-file="" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534648 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534651 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534654 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534660 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534663 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534666 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534670 2576 flags.go:64] FLAG: --logging-format="text" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534673 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534676 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534679 2576 flags.go:64] FLAG: --manifest-url="" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534683 2576 flags.go:64] FLAG: 
--manifest-url-header="" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534687 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534691 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534695 2576 flags.go:64] FLAG: --max-pods="110" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534698 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534701 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534704 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534707 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534711 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:24:10.538168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534714 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534717 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534724 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534727 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534730 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534733 2576 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:24:10.538789 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534736 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534741 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534744 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534747 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534751 2576 flags.go:64] FLAG: --port="10250" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534754 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534757 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ce4673f95cddd505" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534760 2576 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534763 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534766 2576 flags.go:64] FLAG: --register-node="true" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534769 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534772 2576 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534776 2576 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534779 2576 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: 
I0422 19:24:10.534785 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534787 2576 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534792 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534795 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534798 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:24:10.538789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534801 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534804 2576 flags.go:64] FLAG: --runonce="false" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534807 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534810 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534813 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534816 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534819 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534823 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534826 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534829 2576 flags.go:64] FLAG: --storage-driver-password="root" 
Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534832 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534835 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534838 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534841 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534844 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534847 2576 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534850 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534855 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534858 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534861 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534865 2576 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534868 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534871 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534874 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534876 
2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:24:10.539407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534880 2576 flags.go:64] FLAG: --v="2" Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534887 2576 flags.go:64] FLAG: --version="false" Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534893 2576 flags.go:64] FLAG: --vmodule="" Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534897 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.534900 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.534991 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.534995 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.534998 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535001 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535004 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535007 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535009 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535012 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:24:10.540022 
ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535015 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535018 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535021 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535024 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535027 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535029 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535032 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535035 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:24:10.540022 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535037 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535040 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535042 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535045 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535048 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 
22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535050 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535052 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535055 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535058 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535060 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535063 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535066 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535068 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535072 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535075 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535078 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535080 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535083 
2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535086 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535088 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:24:10.540621 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535091 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535094 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535096 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535099 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535102 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535104 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535107 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535110 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535112 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535115 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 
19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535117 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535120 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535123 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535125 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535128 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535131 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535133 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535136 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535138 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:24:10.541156 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535142 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535146 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535149 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535152 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535155 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535158 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535162 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535165 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535167 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535170 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535172 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535175 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535178 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 
19:24:10.535180 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535183 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535185 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535188 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535192 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535195 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:24:10.541650 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535198 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535201 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535204 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535207 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535209 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535212 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535214 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535217 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535220 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535222 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535225 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.535228 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:24:10.542112 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.535928 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:24:10.542410 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.542332 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:24:10.542410 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.542348 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:24:10.542410 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542396 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:24:10.542410 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542401 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:24:10.542410 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542405 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:24:10.542410 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542408 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:24:10.542410 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542411 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:24:10.542410 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542413 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542416 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542419 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542422 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542425 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542427 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542430 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542432 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542435 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542438 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542441 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542443 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542446 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542449 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542451 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542454 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542456 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542459 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542462 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:24:10.542630 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542464 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542467 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542470 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542472 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542474 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542477 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542479 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542488 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542492 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542496 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542517 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542520 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542523 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542526 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542530 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542534 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542538 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542541 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542544 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:24:10.543091 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542546 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542549 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542552 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542554 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542557 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542559 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542562 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542565 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542567 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542570 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542572 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542575 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542578 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542581 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542584 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542586 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542589 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542591 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542594 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542596 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:24:10.543618 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542598 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542607 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542610 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542612 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542615 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542617 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542620 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542623 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542626 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542628 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542631 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542634 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542637 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542640 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542642 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542645 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542648 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542650 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542653 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542656 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:24:10.544106 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542658 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542661 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542663 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.542668 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542791 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542796 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542799 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542802 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542805 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542808 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542811 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542814 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542816 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542819 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542829 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:24:10.544690 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542833 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542835 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542838 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542841 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542844 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542848 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542851 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542854 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542857 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542859 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542862 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542864 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542867 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542869 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542872 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542874 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542877 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542879 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542882 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542884 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:24:10.545077 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542887 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542890 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542892 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542894 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542898 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542901 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542904 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542907 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542909 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542912 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542914 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542917 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542924 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542927 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542930 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542932 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542935 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542938 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542940 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:24:10.545591 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542943 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542945 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542948 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542950 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542953 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542955 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542958 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542960 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542963 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542965 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542967 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542970 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542972 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542975 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542977 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542980 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542982 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542985 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542987 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542990 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:24:10.546079 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542992 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542995 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.542997 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543000 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543002 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543004 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543012 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543015 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543017 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543020 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543029 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543032 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543035 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543037 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543040 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:10.543042 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:24:10.546683 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.543047 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:24:10.547085 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.543608 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 19:24:10.547085 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.545432 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 19:24:10.547085 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.546292 2576 server.go:1019] "Starting client certificate rotation"
Apr 22 19:24:10.547085 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.546385 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:24:10.547085 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.546427 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 19:24:10.569181 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.569162 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:24:10.571832 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.571795 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 19:24:10.586511 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.586476 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 22 19:24:10.591864 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.591849 2576 log.go:25] "Validated CRI v1 image API"
Apr 22 19:24:10.593094 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.593077 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 19:24:10.597650 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.597627 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:24:10.598469 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.598446 2576 fs.go:135] Filesystem UUIDs: map[05946f86-5e75-4a96-a999-ea41460b86b0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 d9ddd532-ab5f-4471-bd6a-c2dbdbf565f7:/dev/nvme0n1p4]
Apr 22 19:24:10.598568 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.598466 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 19:24:10.604232 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.604123 2576 manager.go:217] Machine: {Timestamp:2026-04-22 19:24:10.602114893 +0000 UTC m=+0.387104756 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097251 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e04ae42dc8bcccf02c818ac69a852 SystemUUID:ec2e04ae-42dc-8bcc-cf02-c818ac69a852 BootID:7f92c72c-bca3-417e-a084-643c0e29a33c Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ec:56:93:e7:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ec:56:93:e7:11 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8a:d5:bb:3d:ad:0a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 19:24:10.604314 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.604226 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 19:24:10.604347 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.604325 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 19:24:10.605371 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.605341 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 19:24:10.605559 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.605373 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-160.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:24:10.605659 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.605573 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:24:10.605659 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.605586 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:24:10.605659 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.605604 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:24:10.606277 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.606265 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:24:10.607374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.607361 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:24:10.607525 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.607514 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:24:10.609627 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.609615 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:24:10.609692 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.609633 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:24:10.609692 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.609651 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:24:10.609692 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:24:10.609666 2576 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:24:10.609692 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.609679 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:24:10.610763 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.610750 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:24:10.610827 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.610774 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:24:10.613384 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.613370 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:24:10.614594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.614578 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:24:10.616044 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616029 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616051 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616061 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616069 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616078 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616087 2576 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/secret" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616096 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616105 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616114 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:24:10.616125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616124 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:24:10.616408 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616144 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:24:10.616408 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616159 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:24:10.616943 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616931 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:24:10.617005 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.616946 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:24:10.620586 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.620572 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:24:10.620673 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.620615 2576 server.go:1295] "Started kubelet" Apr 22 19:24:10.620725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.620693 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:24:10.620835 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.620793 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:24:10.620875 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:24:10.620853 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:24:10.621945 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.621917 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l8jlw" Apr 22 19:24:10.622144 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.622130 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:24:10.622280 ip-10-0-132-160 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:24:10.622940 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.622836 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-160.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:24:10.623044 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.623017 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:24:10.623098 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.623080 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:24:10.624487 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.624469 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:24:10.630403 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.629565 2576 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-160.ec2.internal.18a8c4444ca85e85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-160.ec2.internal,UID:ip-10-0-132-160.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-160.ec2.internal,},FirstTimestamp:2026-04-22 19:24:10.620583557 +0000 UTC m=+0.405573425,LastTimestamp:2026-04-22 19:24:10.620583557 +0000 UTC m=+0.405573425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-160.ec2.internal,}" Apr 22 19:24:10.630495 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.630437 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:24:10.630495 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.630473 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:24:10.630730 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.630700 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:24:10.631064 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631043 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:24:10.631164 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631146 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:24:10.631228 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631186 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:24:10.631278 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.631240 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 22 19:24:10.631324 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631316 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:24:10.631324 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631324 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:24:10.631419 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631412 2576 factory.go:55] Registering systemd factory Apr 22 19:24:10.631454 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631439 2576 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:24:10.631760 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631746 2576 factory.go:153] Registering CRI-O factory Apr 22 19:24:10.631842 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631762 2576 factory.go:223] Registration of the crio container factory successfully Apr 22 19:24:10.631842 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631809 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 
19:24:10.631842 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631834 2576 factory.go:103] Registering Raw factory Apr 22 19:24:10.631975 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.631849 2576 manager.go:1196] Started watching for new ooms in manager Apr 22 19:24:10.632207 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.632193 2576 manager.go:319] Starting recovery of all containers Apr 22 19:24:10.633490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.632723 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-l8jlw" Apr 22 19:24:10.637085 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.636933 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:24:10.637289 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.637267 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:24:10.641070 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.641055 2576 manager.go:324] Recovery completed Apr 22 19:24:10.645162 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.645149 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:24:10.647484 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.647467 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:24:10.647585 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:24:10.647513 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:24:10.647585 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.647529 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:24:10.648018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.648004 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:24:10.648018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.648017 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:24:10.648126 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.648034 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:24:10.651036 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.651022 2576 policy_none.go:49] "None policy: Start" Apr 22 19:24:10.651111 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.651040 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:24:10.651111 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.651053 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.695062 2576 manager.go:341] "Starting Device Plugin manager" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.695098 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.695108 2576 server.go:85] "Starting device plugin registration server" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.695330 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.695343 2576 container_log_manager.go:189] "Initializing 
container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.695435 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.695535 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.695545 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.695995 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:24:10.708018 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.696025 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-160.ec2.internal\" not found" Apr 22 19:24:10.748290 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.748263 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:24:10.749421 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.749404 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:24:10.749485 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.749434 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:24:10.749485 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.749454 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 19:24:10.749485 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.749462 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:24:10.749625 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.749492 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:24:10.752379 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.752348 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:24:10.795881 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.795836 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:24:10.796939 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.796924 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:24:10.797012 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.796951 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:24:10.797012 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.796960 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:24:10.797012 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.796982 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.805347 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.805332 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.805391 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.805353 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-160.ec2.internal\": node \"ip-10-0-132-160.ec2.internal\" not found" Apr 22 
19:24:10.832496 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.832472 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 22 19:24:10.850481 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.850459 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal"] Apr 22 19:24:10.850562 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.850536 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:24:10.851450 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.851431 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:24:10.851542 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.851462 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:24:10.851542 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.851473 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:24:10.853858 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.853845 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:24:10.853966 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.853949 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.854003 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.853981 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:24:10.854565 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.854548 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:24:10.854636 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.854579 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:24:10.854636 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.854589 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:24:10.854636 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.854548 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:24:10.854715 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.854650 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:24:10.854715 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.854662 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:24:10.856872 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.856859 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.856959 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.856882 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:24:10.857476 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.857463 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:24:10.857563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.857492 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:24:10.857563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.857516 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:24:10.870053 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.870035 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-160.ec2.internal\" not found" node="ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.873981 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.873967 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-160.ec2.internal\" not found" node="ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.932697 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:10.932659 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 22 19:24:10.932804 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.932716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.932804 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.932742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/206ff994154571336dcc99880b36f4f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-160.ec2.internal\" (UID: \"206ff994154571336dcc99880b36f4f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" Apr 22 19:24:10.932804 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:10.932759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 22 19:24:11.033364 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.033336 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 22 19:24:11.033460 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.033386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 22 19:24:11.033460 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.033413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.033460 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.033428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/206ff994154571336dcc99880b36f4f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-160.ec2.internal\" (UID: \"206ff994154571336dcc99880b36f4f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.033460 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.033457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/206ff994154571336dcc99880b36f4f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-160.ec2.internal\" (UID: \"206ff994154571336dcc99880b36f4f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.033627 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.033470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.033627 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.033487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.134114 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.134046 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found"
Apr 22 19:24:11.172241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.172219 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.176650 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.176631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.234353 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.234310 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found"
Apr 22 19:24:11.334732 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.334689 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found"
Apr 22 19:24:11.435167 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.435095 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found"
Apr 22 19:24:11.535383 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.535354 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found"
Apr 22 19:24:11.546538 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.546497 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:24:11.546654 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.546638 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:24:11.631584 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.631559 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:24:11.634850 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.634820 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:19:10 +0000 UTC" deadline="2027-12-30 03:31:22.384666624 +0000 UTC"
Apr 22 19:24:11.634850 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.634847 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14792h7m10.749823169s"
Apr 22 19:24:11.636435 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.636417 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found"
Apr 22 19:24:11.636887 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.636870 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:24:11.640801 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.640784 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:24:11.661761 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.661744 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qdrvt"
Apr 22 19:24:11.670349 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.670324 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qdrvt"
Apr 22 19:24:11.736985 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.736925 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found"
Apr 22 19:24:11.780105 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:11.780059 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod206ff994154571336dcc99880b36f4f2.slice/crio-f5be60aa3319521c2e9b4817ba4feeba00fb2116fc73c11d5121263912caf6fb WatchSource:0}: Error finding container f5be60aa3319521c2e9b4817ba4feeba00fb2116fc73c11d5121263912caf6fb: Status 404 returned error can't find the container with id f5be60aa3319521c2e9b4817ba4feeba00fb2116fc73c11d5121263912caf6fb
Apr 22 19:24:11.780697 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:11.780670 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0393c54dc21367a93a647d0297c0f90.slice/crio-c0150bbfd0e82cdaed6b2263b9cbcd9015317f85c939e88da5c702c7447cbf96 WatchSource:0}: Error finding container c0150bbfd0e82cdaed6b2263b9cbcd9015317f85c939e88da5c702c7447cbf96: Status 404 returned error can't find the container with id c0150bbfd0e82cdaed6b2263b9cbcd9015317f85c939e88da5c702c7447cbf96
Apr 22 19:24:11.784422 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.784408 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:24:11.837833 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:11.837803 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found"
Apr 22 19:24:11.861372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.861346 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:24:11.873735 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.873720 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:24:11.931660 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.931639 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.943343 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.943325 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:24:11.944619 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.944607 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal"
Apr 22 19:24:11.950969 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:11.950956 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:24:12.610928 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.610889 2576 apiserver.go:52] "Watching apiserver"
Apr 22 19:24:12.618300 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.618275 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:24:12.619447 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.619417 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-frj6d","openshift-image-registry/node-ca-klmr4","openshift-multus/multus-additional-cni-plugins-4q6q8","openshift-multus/multus-prwb9","openshift-multus/network-metrics-daemon-rqq85","openshift-network-diagnostics/network-check-target-sxrzv","kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw","openshift-cluster-node-tuning-operator/tuned-qjfjk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal","openshift-network-operator/iptables-alerter-l2kj9","openshift-ovn-kubernetes/ovnkube-node-9f6tl","kube-system/konnectivity-agent-fkhrk"]
Apr 22 19:24:12.622128 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.622107 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.624432 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.624242 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-klmr4"
Apr 22 19:24:12.624916 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.624894 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:24:12.625043 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.625005 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:24:12.625126 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.625094 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:24:12.625761 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.625738 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:24:12.625856 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.625785 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:24:12.625856 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.625807 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l955b\""
Apr 22 19:24:12.625856 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.625744 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:24:12.626663 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.626529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22 19:24:12.627678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.627441 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:24:12.627678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.627473 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:24:12.627678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.627512 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-v5nhj\""
Apr 22 19:24:12.627678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.627446 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:24:12.629234 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.628664 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:24:12.629234 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.628810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:24:12.629234 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.629075 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:24:12.629234 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.629109 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:24:12.629234 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.629128 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wbbcm\""
Apr 22 19:24:12.629234 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.629075 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:24:12.632593 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.632575 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.635152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.635133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vqgbg\""
Apr 22 19:24:12.635473 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.635341 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:24:12.635587 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.635536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:12.635646 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:12.635619 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:24:12.637793 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.637772 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:12.637885 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:12.637833 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525"
Apr 22 19:24:12.640159 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.640139 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-frj6d"
Apr 22 19:24:12.640278 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.640258 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw"
Apr 22 19:24:12.642434 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.642359 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:24:12.642434 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.642360 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g6d9q\""
Apr 22 19:24:12.642652 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.642463 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:24:12.642777 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.642719 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-97wkg\""
Apr 22 19:24:12.642858 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.642842 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:24:12.642962 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.642942 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:24:12.643084 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-host\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4"
Apr 22 19:24:12.643178 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-cni-bin\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.643178 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-kubelet\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.643178 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:12.643178 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg7t6\" (UniqueName: \"kubernetes.io/projected/342209dc-2f51-4fc2-a96f-a19424f86d57-kube-api-access-mg7t6\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:12.643380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:24:12.643380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643194 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-log-socket\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.643380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mfnl\" (UniqueName: \"kubernetes.io/projected/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-kube-api-access-9mfnl\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4"
Apr 22 19:24:12.643380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-cnibin\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.643380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-etc-kubernetes\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.643380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-etc-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.643380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643376 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a50980a-3501-4203-96f8-93510d032673-ovn-node-metrics-cert\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-ovnkube-script-lib\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cnibin\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-system-cni-dir\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-os-release\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-netns\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-kubelet\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-systemd-units\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-run-netns\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-var-lib-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.643742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-ovn\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-cni-netd\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-env-overrides\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzfsr\" (UniqueName: \"kubernetes.io/projected/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-kube-api-access-gzfsr\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-socket-dir-parent\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-k8s-cni-cncf-io\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-cni-multus\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.643995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-hostroot\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-conf-dir\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-run-ovn-kubernetes\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-system-cni-dir\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-cni-dir\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-systemd\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-node-log\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-ovnkube-config\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-serviceca\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-daemon-config\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-multus-certs\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d49bn\" (UniqueName: \"kubernetes.io/projected/357c3ccf-e489-43f2-ae48-f23177ef4481-kube-api-access-d49bn\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-slash\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-cni-bin\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9mrk\" (UniqueName: \"kubernetes.io/projected/3a50980a-3501-4203-96f8-93510d032673-kube-api-access-r9mrk\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-os-release\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22 19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8"
Apr 22
19:24:12.644770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.644536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/357c3ccf-e489-43f2-ae48-f23177ef4481-cni-binary-copy\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.645372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.645348 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fkhrk" Apr 22 19:24:12.645739 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.645720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.647719 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.647698 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.649397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.648536 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:24:12.649397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.648601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9b8wr\"" Apr 22 19:24:12.649397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.648708 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:24:12.649397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.648784 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:24:12.649397 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:24:12.648787 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wm46v\"" Apr 22 19:24:12.649397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.648909 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:24:12.649848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.649829 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:24:12.650114 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.650058 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:24:12.650114 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.650062 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-xnlqg\"" Apr 22 19:24:12.650491 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.650424 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:24:12.671367 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.671079 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:19:11 +0000 UTC" deadline="2027-10-06 02:49:13.501857755 +0000 UTC" Apr 22 19:24:12.671367 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.671107 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12751h25m0.830755727s" Apr 22 19:24:12.732250 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.732219 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:24:12.745165 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:24:12.745128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4mpq\" (UniqueName: \"kubernetes.io/projected/32f2d314-d41f-4b5f-990f-0750a69d5f47-kube-api-access-g4mpq\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.745307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e93892c7-b516-4779-8a8c-bfdaed0ccee1-tmp\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.745307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-serviceca\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4" Apr 22 19:24:12.745307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-host\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.745307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-cni-bin\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.746372 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745313 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9mrk\" (UniqueName: \"kubernetes.io/projected/3a50980a-3501-4203-96f8-93510d032673-kube-api-access-r9mrk\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-os-release\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-cni-bin\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: 
\"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-registration-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-var-lib-kubelet\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-run\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745577 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-tuned\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mfnl\" (UniqueName: \"kubernetes.io/projected/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-kube-api-access-9mfnl\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-etc-kubernetes\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-hosts-file\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-os-release\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrnwl\" (UniqueName: \"kubernetes.io/projected/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-kube-api-access-rrnwl\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.746372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-lib-modules\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:12.745760 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-ovnkube-script-lib\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cnibin\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:12.745870 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:13.245816353 +0000 UTC m=+3.030806219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745893 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-serviceca\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-os-release\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-systemd-units\") pod \"ovnkube-node-9f6tl\" (UID: 
\"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cnibin\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-run-netns\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-systemd-units\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cni-binary-copy\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.745962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-run-netns\") pod 
\"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-ovn\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-etc-kubernetes\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-os-release\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.747231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-conf-dir\") pod \"multus-prwb9\" (UID: 
\"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-ovn\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-systemd\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-conf-dir\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746189 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-socket-dir-parent\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-k8s-cni-cncf-io\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-socket-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-socket-dir-parent\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/388a5b94-24f5-48ff-aa37-825c8e5a0b2a-agent-certs\") pod \"konnectivity-agent-fkhrk\" (UID: \"388a5b94-24f5-48ff-aa37-825c8e5a0b2a\") " pod="kube-system/konnectivity-agent-fkhrk" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-k8s-cni-cncf-io\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/32f2d314-d41f-4b5f-990f-0750a69d5f47-iptables-alerter-script\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-kubernetes\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746487 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-run-ovn-kubernetes\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-cni-dir\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-systemd\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-node-log\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746593 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-ovnkube-script-lib\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-ovnkube-config\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-cni-dir\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-daemon-config\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-run-systemd\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-node-log\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-run-ovn-kubernetes\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-multus-certs\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d49bn\" (UniqueName: \"kubernetes.io/projected/357c3ccf-e489-43f2-ae48-f23177ef4481-kube-api-access-d49bn\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746801 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxz5t\" (UniqueName: \"kubernetes.io/projected/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-kube-api-access-lxz5t\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-slash\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/357c3ccf-e489-43f2-ae48-f23177ef4481-cni-binary-copy\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-host\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-cni-bin\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.748825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746928 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-kubelet\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg7t6\" (UniqueName: \"kubernetes.io/projected/342209dc-2f51-4fc2-a96f-a19424f86d57-kube-api-access-mg7t6\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.746981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-tmp-dir\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysconfig\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-multus-certs\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:24:12.747033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysctl-d\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-ovnkube-config\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-slash\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/357c3ccf-e489-43f2-ae48-f23177ef4481-multus-daemon-config\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-kubelet\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747238 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-cni-bin\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-host\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-log-socket\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-cnibin\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-log-socket\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-sys-fs\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-sys\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/357c3ccf-e489-43f2-ae48-f23177ef4481-cni-binary-copy\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.749590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-etc-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-cnibin\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3a50980a-3501-4203-96f8-93510d032673-ovn-node-metrics-cert\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-etc-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-system-cni-dir\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-netns\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f2d314-d41f-4b5f-990f-0750a69d5f47-host-slash\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747905 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-modprobe-d\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-run-netns\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747934 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-system-cni-dir\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747928 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysctl-conf\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747971 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-kubelet\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.747995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-var-lib-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-cni-netd\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-kubelet\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748038 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-cni-netd\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.750355 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-var-lib-openvswitch\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-env-overrides\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzfsr\" (UniqueName: 
\"kubernetes.io/projected/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-kube-api-access-gzfsr\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-cni-multus\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748189 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3a50980a-3501-4203-96f8-93510d032673-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-hostroot\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748246 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-hostroot\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/357c3ccf-e489-43f2-ae48-f23177ef4481-host-var-lib-cni-multus\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748247 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-device-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/388a5b94-24f5-48ff-aa37-825c8e5a0b2a-konnectivity-ca\") pod \"konnectivity-agent-fkhrk\" (UID: \"388a5b94-24f5-48ff-aa37-825c8e5a0b2a\") " pod="kube-system/konnectivity-agent-fkhrk" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2tg\" (UniqueName: \"kubernetes.io/projected/e93892c7-b516-4779-8a8c-bfdaed0ccee1-kube-api-access-6f2tg\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.751208 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-system-cni-dir\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-etc-selinux\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.748414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-system-cni-dir\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.751208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.749111 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3a50980a-3501-4203-96f8-93510d032673-env-overrides\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.751992 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.751971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3a50980a-3501-4203-96f8-93510d032673-ovn-node-metrics-cert\") pod \"ovnkube-node-9f6tl\" (UID: 
\"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.754738 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.754663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" event={"ID":"206ff994154571336dcc99880b36f4f2","Type":"ContainerStarted","Data":"f5be60aa3319521c2e9b4817ba4feeba00fb2116fc73c11d5121263912caf6fb"} Apr 22 19:24:12.757143 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.756382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mfnl\" (UniqueName: \"kubernetes.io/projected/c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a-kube-api-access-9mfnl\") pod \"node-ca-klmr4\" (UID: \"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a\") " pod="openshift-image-registry/node-ca-klmr4" Apr 22 19:24:12.757143 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.756919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" event={"ID":"a0393c54dc21367a93a647d0297c0f90","Type":"ContainerStarted","Data":"c0150bbfd0e82cdaed6b2263b9cbcd9015317f85c939e88da5c702c7447cbf96"} Apr 22 19:24:12.757143 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:12.756989 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:12.757143 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:12.757008 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:12.757143 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:12.757020 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9jg2g for pod openshift-network-diagnostics/network-check-target-sxrzv: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:12.757143 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:12.757077 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g podName:52bb575c-7bf3-4562-b917-b5d06e683525 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:13.257060251 +0000 UTC m=+3.042050114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9jg2g" (UniqueName: "kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g") pod "network-check-target-sxrzv" (UID: "52bb575c-7bf3-4562-b917-b5d06e683525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:12.757430 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.757227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d49bn\" (UniqueName: \"kubernetes.io/projected/357c3ccf-e489-43f2-ae48-f23177ef4481-kube-api-access-d49bn\") pod \"multus-prwb9\" (UID: \"357c3ccf-e489-43f2-ae48-f23177ef4481\") " pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.758078 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.758007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9mrk\" (UniqueName: \"kubernetes.io/projected/3a50980a-3501-4203-96f8-93510d032673-kube-api-access-r9mrk\") pod \"ovnkube-node-9f6tl\" (UID: \"3a50980a-3501-4203-96f8-93510d032673\") " pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.759237 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.759197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg7t6\" (UniqueName: 
\"kubernetes.io/projected/342209dc-2f51-4fc2-a96f-a19424f86d57-kube-api-access-mg7t6\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:12.760744 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.760723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzfsr\" (UniqueName: \"kubernetes.io/projected/b2d3136a-c73d-4ecd-a4e9-a9c3c3605915-kube-api-access-gzfsr\") pod \"multus-additional-cni-plugins-4q6q8\" (UID: \"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915\") " pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.849042 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxz5t\" (UniqueName: \"kubernetes.io/projected/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-kube-api-access-lxz5t\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.849208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-tmp-dir\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.849208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysconfig\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849139 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysctl-d\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-sys-fs\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysconfig\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-sys\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f2d314-d41f-4b5f-990f-0750a69d5f47-host-slash\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-modprobe-d\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-sys-fs\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysctl-conf\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysctl-d\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-sys\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-device-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/32f2d314-d41f-4b5f-990f-0750a69d5f47-host-slash\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/388a5b94-24f5-48ff-aa37-825c8e5a0b2a-konnectivity-ca\") pod \"konnectivity-agent-fkhrk\" (UID: \"388a5b94-24f5-48ff-aa37-825c8e5a0b2a\") " pod="kube-system/konnectivity-agent-fkhrk" Apr 22 19:24:12.849404 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-tmp-dir\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-modprobe-d\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6f2tg\" (UniqueName: \"kubernetes.io/projected/e93892c7-b516-4779-8a8c-bfdaed0ccee1-kube-api-access-6f2tg\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-sysctl-conf\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-device-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-etc-selinux\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-etc-selinux\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849588 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4mpq\" (UniqueName: \"kubernetes.io/projected/32f2d314-d41f-4b5f-990f-0750a69d5f47-kube-api-access-g4mpq\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e93892c7-b516-4779-8a8c-bfdaed0ccee1-tmp\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-host\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-registration-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-var-lib-kubelet\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:24:12.849734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-host\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-run\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-tuned\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-hosts-file\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.849968 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:24:12.849925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrnwl\" (UniqueName: \"kubernetes.io/projected/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-kube-api-access-rrnwl\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-lib-modules\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-registration-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/388a5b94-24f5-48ff-aa37-825c8e5a0b2a-konnectivity-ca\") pod \"konnectivity-agent-fkhrk\" (UID: \"388a5b94-24f5-48ff-aa37-825c8e5a0b2a\") " pod="kube-system/konnectivity-agent-fkhrk" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.849969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-var-lib-kubelet\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 
22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-run\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-hosts-file\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-systemd\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-socket-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/388a5b94-24f5-48ff-aa37-825c8e5a0b2a-agent-certs\") pod \"konnectivity-agent-fkhrk\" (UID: \"388a5b94-24f5-48ff-aa37-825c8e5a0b2a\") " pod="kube-system/konnectivity-agent-fkhrk" Apr 22 19:24:12.850680 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:24:12.850139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/32f2d314-d41f-4b5f-990f-0750a69d5f47-iptables-alerter-script\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-kubernetes\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-kubernetes\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-systemd\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 
22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e93892c7-b516-4779-8a8c-bfdaed0ccee1-lib-modules\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.850680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.850362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-socket-dir\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.851328 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.851191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/32f2d314-d41f-4b5f-990f-0750a69d5f47-iptables-alerter-script\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.852195 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.852173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e93892c7-b516-4779-8a8c-bfdaed0ccee1-tmp\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.852314 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.852223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e93892c7-b516-4779-8a8c-bfdaed0ccee1-etc-tuned\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 
19:24:12.853101 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.853066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/388a5b94-24f5-48ff-aa37-825c8e5a0b2a-agent-certs\") pod \"konnectivity-agent-fkhrk\" (UID: \"388a5b94-24f5-48ff-aa37-825c8e5a0b2a\") " pod="kube-system/konnectivity-agent-fkhrk" Apr 22 19:24:12.858255 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.858232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxz5t\" (UniqueName: \"kubernetes.io/projected/9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6-kube-api-access-lxz5t\") pod \"node-resolver-frj6d\" (UID: \"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6\") " pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.858805 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.858781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2tg\" (UniqueName: \"kubernetes.io/projected/e93892c7-b516-4779-8a8c-bfdaed0ccee1-kube-api-access-6f2tg\") pod \"tuned-qjfjk\" (UID: \"e93892c7-b516-4779-8a8c-bfdaed0ccee1\") " pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:12.861249 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.861166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrnwl\" (UniqueName: \"kubernetes.io/projected/cb824fe2-b0a1-4056-99d2-0bf9b49c0d13-kube-api-access-rrnwl\") pod \"aws-ebs-csi-driver-node-2qrtw\" (UID: \"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.861618 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.861593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4mpq\" (UniqueName: \"kubernetes.io/projected/32f2d314-d41f-4b5f-990f-0750a69d5f47-kube-api-access-g4mpq\") pod \"iptables-alerter-l2kj9\" (UID: \"32f2d314-d41f-4b5f-990f-0750a69d5f47\") " 
pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:12.943850 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.943816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:24:12.944005 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.943951 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:24:12.952128 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.952067 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-klmr4" Apr 22 19:24:12.962579 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.962557 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" Apr 22 19:24:12.968348 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.968323 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-prwb9" Apr 22 19:24:12.976893 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.976873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-frj6d" Apr 22 19:24:12.985717 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.985679 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" Apr 22 19:24:12.993554 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:12.993531 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fkhrk" Apr 22 19:24:13.001281 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.001258 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" Apr 22 19:24:13.007888 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.007851 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-l2kj9" Apr 22 19:24:13.252877 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.252793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:13.253037 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:13.252955 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:13.253037 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:13.253029 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:14.253012297 +0000 UTC m=+4.038002151 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:13.354006 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.353965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:13.354181 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:13.354153 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:13.354181 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:13.354180 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:13.354276 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:13.354193 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9jg2g for pod openshift-network-diagnostics/network-check-target-sxrzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:13.354276 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:13.354257 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g podName:52bb575c-7bf3-4562-b917-b5d06e683525 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:14.354239358 +0000 UTC m=+4.139229210 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9jg2g" (UniqueName: "kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g") pod "network-check-target-sxrzv" (UID: "52bb575c-7bf3-4562-b917-b5d06e683525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:13.454662 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.454632 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93892c7_b516_4779_8a8c_bfdaed0ccee1.slice/crio-3f8c77dbcb6cb9a6d940b2fbc772c26223cbe4820218449ec0693b3f056961b5 WatchSource:0}: Error finding container 3f8c77dbcb6cb9a6d940b2fbc772c26223cbe4820218449ec0693b3f056961b5: Status 404 returned error can't find the container with id 3f8c77dbcb6cb9a6d940b2fbc772c26223cbe4820218449ec0693b3f056961b5 Apr 22 19:24:13.456202 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.456119 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod388a5b94_24f5_48ff_aa37_825c8e5a0b2a.slice/crio-91b0bdc931e34d96808f860ba5a8c406abf8624b2f5e4b40c8204df92064c391 WatchSource:0}: Error finding container 91b0bdc931e34d96808f860ba5a8c406abf8624b2f5e4b40c8204df92064c391: Status 404 returned error can't find the container with id 91b0bdc931e34d96808f860ba5a8c406abf8624b2f5e4b40c8204df92064c391 Apr 22 19:24:13.459603 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.459514 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb824fe2_b0a1_4056_99d2_0bf9b49c0d13.slice/crio-fe60045c4bc116eeaa4d1994e1868be202e20537faa3a153fcf44b8520239874 WatchSource:0}: Error finding container 
fe60045c4bc116eeaa4d1994e1868be202e20537faa3a153fcf44b8520239874: Status 404 returned error can't find the container with id fe60045c4bc116eeaa4d1994e1868be202e20537faa3a153fcf44b8520239874 Apr 22 19:24:13.460343 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.460320 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0eeacf_656e_4f8f_aa52_91ca36e9a6b6.slice/crio-0e6c614401a328519e6a73b43e178f359104910f0a9caa18b4f50d31f4ef48bb WatchSource:0}: Error finding container 0e6c614401a328519e6a73b43e178f359104910f0a9caa18b4f50d31f4ef48bb: Status 404 returned error can't find the container with id 0e6c614401a328519e6a73b43e178f359104910f0a9caa18b4f50d31f4ef48bb Apr 22 19:24:13.461136 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.461118 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a50980a_3501_4203_96f8_93510d032673.slice/crio-525bb7735f49d26587e666267863f7d7ac49539b2dcac6cdaf3b21ea19c4cfc9 WatchSource:0}: Error finding container 525bb7735f49d26587e666267863f7d7ac49539b2dcac6cdaf3b21ea19c4cfc9: Status 404 returned error can't find the container with id 525bb7735f49d26587e666267863f7d7ac49539b2dcac6cdaf3b21ea19c4cfc9 Apr 22 19:24:13.462188 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.462164 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f2d314_d41f_4b5f_990f_0750a69d5f47.slice/crio-a587dc1336ec5c730ac0c972fbcc10d5da8cfe109571a59231d311bd4528d815 WatchSource:0}: Error finding container a587dc1336ec5c730ac0c972fbcc10d5da8cfe109571a59231d311bd4528d815: Status 404 returned error can't find the container with id a587dc1336ec5c730ac0c972fbcc10d5da8cfe109571a59231d311bd4528d815 Apr 22 19:24:13.463357 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.463303 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c6ea94_6f38_4282_bf4f_2a7ad7fa513a.slice/crio-73d22308a86344444d10db6a80cef88bf9e4523a83035f526471e575c6a2b5e5 WatchSource:0}: Error finding container 73d22308a86344444d10db6a80cef88bf9e4523a83035f526471e575c6a2b5e5: Status 404 returned error can't find the container with id 73d22308a86344444d10db6a80cef88bf9e4523a83035f526471e575c6a2b5e5 Apr 22 19:24:13.464565 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.464302 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod357c3ccf_e489_43f2_ae48_f23177ef4481.slice/crio-2e55300bbbb02b7e965cfef60038c3d14c76d39c86b7125184fbcc88dfbaf230 WatchSource:0}: Error finding container 2e55300bbbb02b7e965cfef60038c3d14c76d39c86b7125184fbcc88dfbaf230: Status 404 returned error can't find the container with id 2e55300bbbb02b7e965cfef60038c3d14c76d39c86b7125184fbcc88dfbaf230 Apr 22 19:24:13.464909 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:24:13.464851 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2d3136a_c73d_4ecd_a4e9_a9c3c3605915.slice/crio-55a00f3e1ab0fc0ed6807c9b75f465750d1bfbb85bba3b3d3f81047239132986 WatchSource:0}: Error finding container 55a00f3e1ab0fc0ed6807c9b75f465750d1bfbb85bba3b3d3f81047239132986: Status 404 returned error can't find the container with id 55a00f3e1ab0fc0ed6807c9b75f465750d1bfbb85bba3b3d3f81047239132986 Apr 22 19:24:13.671674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.671472 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:19:11 +0000 UTC" deadline="2027-10-15 07:38:49.059668994 +0000 UTC" Apr 22 19:24:13.671674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.671669 2576 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="12972h14m35.388004131s" Apr 22 19:24:13.759974 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.759893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" event={"ID":"206ff994154571336dcc99880b36f4f2","Type":"ContainerStarted","Data":"3cf12239bf5b90dbb993f97842931eadfa70d08780b2ddba44763aa1947f606a"} Apr 22 19:24:13.761088 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.761064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l2kj9" event={"ID":"32f2d314-d41f-4b5f-990f-0750a69d5f47","Type":"ContainerStarted","Data":"a587dc1336ec5c730ac0c972fbcc10d5da8cfe109571a59231d311bd4528d815"} Apr 22 19:24:13.762043 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.762018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"525bb7735f49d26587e666267863f7d7ac49539b2dcac6cdaf3b21ea19c4cfc9"} Apr 22 19:24:13.762910 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.762888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-frj6d" event={"ID":"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6","Type":"ContainerStarted","Data":"0e6c614401a328519e6a73b43e178f359104910f0a9caa18b4f50d31f4ef48bb"} Apr 22 19:24:13.764006 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.763980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" event={"ID":"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915","Type":"ContainerStarted","Data":"55a00f3e1ab0fc0ed6807c9b75f465750d1bfbb85bba3b3d3f81047239132986"} Apr 22 19:24:13.764968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.764949 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fkhrk" 
event={"ID":"388a5b94-24f5-48ff-aa37-825c8e5a0b2a","Type":"ContainerStarted","Data":"91b0bdc931e34d96808f860ba5a8c406abf8624b2f5e4b40c8204df92064c391"} Apr 22 19:24:13.765836 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.765820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-prwb9" event={"ID":"357c3ccf-e489-43f2-ae48-f23177ef4481","Type":"ContainerStarted","Data":"2e55300bbbb02b7e965cfef60038c3d14c76d39c86b7125184fbcc88dfbaf230"} Apr 22 19:24:13.766731 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.766709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-klmr4" event={"ID":"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a","Type":"ContainerStarted","Data":"73d22308a86344444d10db6a80cef88bf9e4523a83035f526471e575c6a2b5e5"} Apr 22 19:24:13.767796 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.767773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" event={"ID":"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13","Type":"ContainerStarted","Data":"fe60045c4bc116eeaa4d1994e1868be202e20537faa3a153fcf44b8520239874"} Apr 22 19:24:13.768670 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.768645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" event={"ID":"e93892c7-b516-4779-8a8c-bfdaed0ccee1","Type":"ContainerStarted","Data":"3f8c77dbcb6cb9a6d940b2fbc772c26223cbe4820218449ec0693b3f056961b5"} Apr 22 19:24:13.774932 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:13.774891 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" podStartSLOduration=2.774880246 podStartE2EDuration="2.774880246s" podCreationTimestamp="2026-04-22 19:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 
19:24:13.774465715 +0000 UTC m=+3.559455588" watchObservedRunningTime="2026-04-22 19:24:13.774880246 +0000 UTC m=+3.559870118" Apr 22 19:24:14.261459 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:14.261424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:14.261621 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:14.261593 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:14.261693 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:14.261655 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:16.261636371 +0000 UTC m=+6.046626224 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:14.362557 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:14.362460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:14.362719 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:14.362677 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:14.362719 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:14.362695 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:14.362719 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:14.362708 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9jg2g for pod openshift-network-diagnostics/network-check-target-sxrzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:14.363012 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:14.362765 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g podName:52bb575c-7bf3-4562-b917-b5d06e683525 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:16.362745821 +0000 UTC m=+6.147735672 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9jg2g" (UniqueName: "kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g") pod "network-check-target-sxrzv" (UID: "52bb575c-7bf3-4562-b917-b5d06e683525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:14.751433 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:14.750519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:14.751433 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:14.750683 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:14.751433 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:14.751175 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:14.751433 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:14.751341 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:14.788946 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:14.787828 2576 generic.go:358] "Generic (PLEG): container finished" podID="a0393c54dc21367a93a647d0297c0f90" containerID="650eb0ecb37db787c18d069bfb49b52815d96a159099aba0a66679eca721b0ea" exitCode=0 Apr 22 19:24:14.788946 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:14.788744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" event={"ID":"a0393c54dc21367a93a647d0297c0f90","Type":"ContainerDied","Data":"650eb0ecb37db787c18d069bfb49b52815d96a159099aba0a66679eca721b0ea"} Apr 22 19:24:15.803490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:15.803450 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" event={"ID":"a0393c54dc21367a93a647d0297c0f90","Type":"ContainerStarted","Data":"bb360b7cfd1f7b8e4b434e1d66f66bcabc21dca2a1be42c2901d550e233078d7"} Apr 22 19:24:16.277527 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:16.277476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:16.277718 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:16.277640 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:16.277788 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:16.277778 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs 
podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.27775972 +0000 UTC m=+10.062749574 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:16.378537 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:16.378486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:16.378697 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:16.378675 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:16.378697 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:16.378692 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:16.378896 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:16.378704 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9jg2g for pod openshift-network-diagnostics/network-check-target-sxrzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:16.378896 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:16.378764 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g podName:52bb575c-7bf3-4562-b917-b5d06e683525 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:20.378746496 +0000 UTC m=+10.163736359 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9jg2g" (UniqueName: "kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g") pod "network-check-target-sxrzv" (UID: "52bb575c-7bf3-4562-b917-b5d06e683525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:16.751979 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:16.751899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:16.752137 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:16.752039 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:16.752137 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:16.752083 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:16.752250 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:16.752209 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:18.750151 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:18.750118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:18.750614 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:18.750127 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:18.750614 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:18.750241 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:18.750614 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:18.750335 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:20.309927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:20.309891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:20.310379 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:20.310078 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:20.310379 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:20.310149 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:28.310129486 +0000 UTC m=+18.095119347 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:20.411584 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:20.410975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:20.411584 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:20.411136 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:20.411584 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:20.411153 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:20.411584 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:20.411165 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9jg2g for pod openshift-network-diagnostics/network-check-target-sxrzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:20.411584 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:20.411221 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g podName:52bb575c-7bf3-4562-b917-b5d06e683525 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:28.411202593 +0000 UTC m=+18.196192460 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9jg2g" (UniqueName: "kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g") pod "network-check-target-sxrzv" (UID: "52bb575c-7bf3-4562-b917-b5d06e683525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:20.752047 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:20.751346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:20.752047 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:20.751459 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:20.752047 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:20.751861 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:20.752047 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:20.751968 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:22.749914 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:22.749823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:22.750346 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:22.749823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:22.750346 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:22.749957 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:22.750346 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:22.750040 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:24.749673 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:24.749637 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:24.750255 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:24.749778 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:24.750255 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:24.749820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:24.750255 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:24.749932 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:26.749728 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:26.749693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:26.749728 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:26.749729 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:26.750204 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:26.749842 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:26.750204 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:26.749976 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:28.369942 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:28.369907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:28.370339 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:28.370028 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:28.370339 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:28.370093 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:24:44.370070657 +0000 UTC m=+34.155060532 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:28.470632 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:28.470598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:28.470788 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:28.470731 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:28.470788 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:28.470747 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:28.470788 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:28.470756 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9jg2g for pod openshift-network-diagnostics/network-check-target-sxrzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:28.470926 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:28.470807 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g 
podName:52bb575c-7bf3-4562-b917-b5d06e683525 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:44.470791801 +0000 UTC m=+34.255781655 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9jg2g" (UniqueName: "kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g") pod "network-check-target-sxrzv" (UID: "52bb575c-7bf3-4562-b917-b5d06e683525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:28.750120 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:28.750034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:28.750120 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:28.750034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:28.750326 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:28.750175 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:28.750326 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:28.750279 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:30.750704 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.750394 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:30.751314 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:30.750777 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525" Apr 22 19:24:30.751314 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.750485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:30.751314 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:30.751092 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57" Apr 22 19:24:30.832087 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.832054 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-prwb9" event={"ID":"357c3ccf-e489-43f2-ae48-f23177ef4481","Type":"ContainerStarted","Data":"f9388fc35462c3ed51d0d8cd105aabff0cc05fc2e6b07da0a1a89729fdbed319"} Apr 22 19:24:30.833340 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.833312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-klmr4" event={"ID":"c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a","Type":"ContainerStarted","Data":"c19757956f546c0be11e651b1456782f9a0b741461563dbe72d2c7d3b48789b2"} Apr 22 19:24:30.834550 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.834527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" event={"ID":"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13","Type":"ContainerStarted","Data":"91bbf61082f0faf27c5483e93b677ee0744eb891e6135e7f9f18ad6355e4684c"} Apr 22 19:24:30.835916 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.835887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" event={"ID":"e93892c7-b516-4779-8a8c-bfdaed0ccee1","Type":"ContainerStarted","Data":"c1f299a454e6345f80c3e82eabbeb9ece0a68b1516c016cbe47f6909dadfdfe8"} Apr 22 19:24:30.837819 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.837800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log" Apr 22 19:24:30.838263 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.838239 2576 generic.go:358] "Generic (PLEG): container finished" podID="3a50980a-3501-4203-96f8-93510d032673" containerID="e8f870eb5ca405e919733a8f331969d228a2c086fb4d981eb74e4c17b7b1ac25" exitCode=1 Apr 22 
19:24:30.838345 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.838309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"d509b266100477aea38749a5ea788acd3a4822681c7f93e28ac241b85cf59934"}
Apr 22 19:24:30.838345 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.838337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"7ccdffa29a4665b983372d24fd71a439c1e3dadc770192963d88174b168efab4"}
Apr 22 19:24:30.838449 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.838347 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerDied","Data":"e8f870eb5ca405e919733a8f331969d228a2c086fb4d981eb74e4c17b7b1ac25"}
Apr 22 19:24:30.838449 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.838357 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"3e1732578d40d4c017c95d69e8bb73690c66fb673be3e6f44f3f31f038433b87"}
Apr 22 19:24:30.839602 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.839571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-frj6d" event={"ID":"9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6","Type":"ContainerStarted","Data":"e2e74b96ce9ad61910f46ce7d730c40ce3a909430863384bb42e116bb076af85"}
Apr 22 19:24:30.841020 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.840999 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2d3136a-c73d-4ecd-a4e9-a9c3c3605915" containerID="8fee7737fdcfc343549c6049f6722478ed38f0fd2c3985df3ec64ab2e5825c65" exitCode=0
Apr 22 19:24:30.841106 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.841057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" event={"ID":"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915","Type":"ContainerDied","Data":"8fee7737fdcfc343549c6049f6722478ed38f0fd2c3985df3ec64ab2e5825c65"}
Apr 22 19:24:30.842828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.842791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fkhrk" event={"ID":"388a5b94-24f5-48ff-aa37-825c8e5a0b2a","Type":"ContainerStarted","Data":"b6aeb7dbdf50c3e0b3956211b25e7f6dc13522ac102a42497466486a969d83e7"}
Apr 22 19:24:30.852837 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.852780 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" podStartSLOduration=19.852754224 podStartE2EDuration="19.852754224s" podCreationTimestamp="2026-04-22 19:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:15.81909421 +0000 UTC m=+5.604084084" watchObservedRunningTime="2026-04-22 19:24:30.852754224 +0000 UTC m=+20.637744073"
Apr 22 19:24:30.853234 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.853212 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-prwb9" podStartSLOduration=3.843795427 podStartE2EDuration="20.853205431s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.466240058 +0000 UTC m=+3.251229921" lastFinishedPulling="2026-04-22 19:24:30.475650062 +0000 UTC m=+20.260639925" observedRunningTime="2026-04-22 19:24:30.852546584 +0000 UTC m=+20.637536453" watchObservedRunningTime="2026-04-22 19:24:30.853205431 +0000 UTC m=+20.638195303"
Apr 22 19:24:30.870630 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.870594 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-klmr4" podStartSLOduration=4.16225995 podStartE2EDuration="20.870584989s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.465234405 +0000 UTC m=+3.250224269" lastFinishedPulling="2026-04-22 19:24:30.173559443 +0000 UTC m=+19.958549308" observedRunningTime="2026-04-22 19:24:30.870133967 +0000 UTC m=+20.655123839" watchObservedRunningTime="2026-04-22 19:24:30.870584989 +0000 UTC m=+20.655574860"
Apr 22 19:24:30.952374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:30.952332 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fkhrk" podStartSLOduration=4.245660629 podStartE2EDuration="20.952315366s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.458106592 +0000 UTC m=+3.243096443" lastFinishedPulling="2026-04-22 19:24:30.164761324 +0000 UTC m=+19.949751180" observedRunningTime="2026-04-22 19:24:30.950292998 +0000 UTC m=+20.735282868" watchObservedRunningTime="2026-04-22 19:24:30.952315366 +0000 UTC m=+20.737305238"
Apr 22 19:24:31.006552 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:31.006487 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qjfjk" podStartSLOduration=4.284580517 podStartE2EDuration="21.006474131s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.456476544 +0000 UTC m=+3.241466396" lastFinishedPulling="2026-04-22 19:24:30.178370157 +0000 UTC m=+19.963360010" observedRunningTime="2026-04-22 19:24:31.005699621 +0000 UTC m=+20.790689494" watchObservedRunningTime="2026-04-22 19:24:31.006474131 +0000 UTC m=+20.791464003"
Apr 22 19:24:31.006937 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:31.006906 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-frj6d" podStartSLOduration=4.294864554 podStartE2EDuration="21.006898284s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.461835468 +0000 UTC m=+3.246825330" lastFinishedPulling="2026-04-22 19:24:30.173869204 +0000 UTC m=+19.958859060" observedRunningTime="2026-04-22 19:24:30.980667072 +0000 UTC m=+20.765656944" watchObservedRunningTime="2026-04-22 19:24:31.006898284 +0000 UTC m=+20.791888155"
Apr 22 19:24:31.846039 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:31.845953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-l2kj9" event={"ID":"32f2d314-d41f-4b5f-990f-0750a69d5f47","Type":"ContainerStarted","Data":"8fde720d37480468df7b58b6e6f11237c5469fc720c966b2549908494e7fa1a5"}
Apr 22 19:24:31.848785 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:31.848709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:24:31.849142 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:31.849121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"0656aa3f4cb04c1bcfb3ab3dbd9e17675cd7acc7e4fce7fb75f28ada541ac346"}
Apr 22 19:24:31.849240 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:31.849153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"890eb87062b3190a3457f3fa03b9bd5b060c614d27798b2e1fb62beceb2d0632"}
Apr 22 19:24:31.863457 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:31.863418 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-l2kj9" podStartSLOduration=5.1539910859999996 podStartE2EDuration="21.863403105s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.464151031 +0000 UTC m=+3.249140884" lastFinishedPulling="2026-04-22 19:24:30.173563044 +0000 UTC m=+19.958552903" observedRunningTime="2026-04-22 19:24:31.863066469 +0000 UTC m=+21.648056355" watchObservedRunningTime="2026-04-22 19:24:31.863403105 +0000 UTC m=+21.648392977"
Apr 22 19:24:31.914096 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:31.914075 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:24:32.708178 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:32.708062 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:24:31.914091666Z","UUID":"274487fc-5fe3-4976-8ed7-7f2dcf355c1e","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:24:32.711004 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:32.710978 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:24:32.711141 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:32.711011 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:24:32.749794 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:32.749760 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:32.749974 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:32.749773 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:32.749974 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:32.749891 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525"
Apr 22 19:24:32.750088 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:32.749968 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:24:32.852388 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:32.852354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" event={"ID":"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13","Type":"ContainerStarted","Data":"a2b9f6058d4fa635e1aabf70771382187df06dbdc17443fecff38a58b4ad854d"}
Apr 22 19:24:33.858231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:33.857944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" event={"ID":"cb824fe2-b0a1-4056-99d2-0bf9b49c0d13","Type":"ContainerStarted","Data":"63fbbc83dde554ad4c56e1a57b02186c49dcc3fee2406ef014b15343fe6a522a"}
Apr 22 19:24:33.861282 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:33.861256 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:24:33.861684 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:33.861648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"d405f06832ff3e5e9fba9f28bb6c684dc6d601cbf2ab7acc5c5e8c7cd993c65a"}
Apr 22 19:24:33.875863 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:33.875820 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2qrtw" podStartSLOduration=4.352302519 podStartE2EDuration="23.875805407s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.461848411 +0000 UTC m=+3.246838272" lastFinishedPulling="2026-04-22 19:24:32.985351296 +0000 UTC m=+22.770341160" observedRunningTime="2026-04-22 19:24:33.875495936 +0000 UTC m=+23.660485822" watchObservedRunningTime="2026-04-22 19:24:33.875805407 +0000 UTC m=+23.660795279"
Apr 22 19:24:34.750353 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:34.750323 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:34.750552 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:34.750442 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525"
Apr 22 19:24:34.750552 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:34.750490 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:34.750664 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:34.750634 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:24:35.475553 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.475317 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fkhrk"
Apr 22 19:24:35.476303 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.476126 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fkhrk"
Apr 22 19:24:35.866609 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.866576 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2d3136a-c73d-4ecd-a4e9-a9c3c3605915" containerID="bee3790ffbbc0419083c0897400dd94f9df94e8aa7173d03551e0f3f131a950c" exitCode=0
Apr 22 19:24:35.866773 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.866648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" event={"ID":"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915","Type":"ContainerDied","Data":"bee3790ffbbc0419083c0897400dd94f9df94e8aa7173d03551e0f3f131a950c"}
Apr 22 19:24:35.869741 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.869725 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:24:35.870102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.870074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"c701c35fd45cd391c8639e5acad31200f057b7370dcc0db0b36d4e34af1c1ffb"}
Apr 22 19:24:35.870269 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.870250 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fkhrk"
Apr 22 19:24:35.870423 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.870408 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:35.870512 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.870432 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:35.870669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.870653 2576 scope.go:117] "RemoveContainer" containerID="e8f870eb5ca405e919733a8f331969d228a2c086fb4d981eb74e4c17b7b1ac25"
Apr 22 19:24:35.870760 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.870745 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fkhrk"
Apr 22 19:24:35.884887 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:35.884870 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:36.750642 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.750611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:36.751055 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.750611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:36.751055 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:36.750752 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:24:36.751055 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:36.750824 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525"
Apr 22 19:24:36.874491 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.874471 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:24:36.874857 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.874832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" event={"ID":"3a50980a-3501-4203-96f8-93510d032673","Type":"ContainerStarted","Data":"4c98a9b674d97e3731edcb79b5abacad6dff58a8d15d3ada09d64e7ca8d5c2e7"}
Apr 22 19:24:36.875182 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.875154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:36.876678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.876655 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2d3136a-c73d-4ecd-a4e9-a9c3c3605915" containerID="56717c1e4615afa5123b9c9c98266aa8a012687eb44082a80aa1e06823f4b170" exitCode=0
Apr 22 19:24:36.876760 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.876694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" event={"ID":"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915","Type":"ContainerDied","Data":"56717c1e4615afa5123b9c9c98266aa8a012687eb44082a80aa1e06823f4b170"}
Apr 22 19:24:36.889228 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.889208 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl"
Apr 22 19:24:36.908321 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:36.908002 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" podStartSLOduration=10.012698243 podStartE2EDuration="26.90794573s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.46311397 +0000 UTC m=+3.248103821" lastFinishedPulling="2026-04-22 19:24:30.358361453 +0000 UTC m=+20.143351308" observedRunningTime="2026-04-22 19:24:36.907392021 +0000 UTC m=+26.692381893" watchObservedRunningTime="2026-04-22 19:24:36.90794573 +0000 UTC m=+26.692935602"
Apr 22 19:24:37.243236 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:37.243062 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sxrzv"]
Apr 22 19:24:37.243372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:37.243322 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:37.243428 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:37.243403 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525"
Apr 22 19:24:37.246021 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:37.245996 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rqq85"]
Apr 22 19:24:37.246142 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:37.246093 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:37.246192 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:37.246170 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:24:37.882724 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:37.882630 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2d3136a-c73d-4ecd-a4e9-a9c3c3605915" containerID="0db0b0e2cc347c97a1672dbc8f6136cd680672bc1b19161927bc5d614998fef1" exitCode=0
Apr 22 19:24:37.883081 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:37.882721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" event={"ID":"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915","Type":"ContainerDied","Data":"0db0b0e2cc347c97a1672dbc8f6136cd680672bc1b19161927bc5d614998fef1"}
Apr 22 19:24:38.750109 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:38.750046 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:38.750326 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:38.750165 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:24:38.750326 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:38.750057 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:38.750326 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:38.750285 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525"
Apr 22 19:24:40.751139 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:40.751102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:40.751823 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:40.751200 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525"
Apr 22 19:24:40.751823 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:40.751286 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:40.751823 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:40.751405 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:24:42.750380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:42.750345 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:24:42.750842 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:42.750353 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv"
Apr 22 19:24:42.750842 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:42.750483 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:24:42.750842 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:42.750578 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sxrzv" podUID="52bb575c-7bf3-4562-b917-b5d06e683525"
Apr 22 19:24:43.014754 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.014688 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeReady"
Apr 22 19:24:43.014884 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.014821 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 19:24:43.060075 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.060046 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9tp5r"]
Apr 22 19:24:43.065190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.065168 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hhwkd"]
Apr 22 19:24:43.065344 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.065326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9tp5r"
Apr 22 19:24:43.067741 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.067713 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 19:24:43.067847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.067740 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 19:24:43.067905 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.067876 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 19:24:43.068012 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.067992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76gtl\""
Apr 22 19:24:43.068727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.068703 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.071401 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.071292 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 19:24:43.071401 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.071361 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 19:24:43.071557 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.071487 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jd9zz\""
Apr 22 19:24:43.072321 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.072286 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9tp5r"]
Apr 22 19:24:43.075558 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.075538 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hhwkd"]
Apr 22 19:24:43.178993 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.178953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r"
Apr 22 19:24:43.179164 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.179018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.179164 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.179054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8046910-c087-4b0d-a917-3216261f41d0-config-volume\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.179164 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.179080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84n4p\" (UniqueName: \"kubernetes.io/projected/15547e0d-8f10-470f-a80b-0cb53add2696-kube-api-access-84n4p\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r"
Apr 22 19:24:43.179164 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.179099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8046910-c087-4b0d-a917-3216261f41d0-tmp-dir\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.179317 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.179188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxm7x\" (UniqueName: \"kubernetes.io/projected/f8046910-c087-4b0d-a917-3216261f41d0-kube-api-access-dxm7x\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.280304 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.280262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.280490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.280319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8046910-c087-4b0d-a917-3216261f41d0-config-volume\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.280490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.280350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84n4p\" (UniqueName: \"kubernetes.io/projected/15547e0d-8f10-470f-a80b-0cb53add2696-kube-api-access-84n4p\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r"
Apr 22 19:24:43.280490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.280388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8046910-c087-4b0d-a917-3216261f41d0-tmp-dir\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.280490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.280420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxm7x\" (UniqueName: \"kubernetes.io/projected/f8046910-c087-4b0d-a917-3216261f41d0-kube-api-access-dxm7x\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.280490 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:43.280433 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:43.280490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.280449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r"
Apr 22 19:24:43.280490 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:43.280495 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls podName:f8046910-c087-4b0d-a917-3216261f41d0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:43.780478117 +0000 UTC m=+33.565467967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls") pod "dns-default-hhwkd" (UID: "f8046910-c087-4b0d-a917-3216261f41d0") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:43.280866 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:43.280561 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:43.280866 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:43.280609 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert podName:15547e0d-8f10-470f-a80b-0cb53add2696 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:43.780592547 +0000 UTC m=+33.565582398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert") pod "ingress-canary-9tp5r" (UID: "15547e0d-8f10-470f-a80b-0cb53add2696") : secret "canary-serving-cert" not found
Apr 22 19:24:43.280866 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.280760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8046910-c087-4b0d-a917-3216261f41d0-tmp-dir\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.281243 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.281225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8046910-c087-4b0d-a917-3216261f41d0-config-volume\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.291611 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.291583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxm7x\" (UniqueName: \"kubernetes.io/projected/f8046910-c087-4b0d-a917-3216261f41d0-kube-api-access-dxm7x\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.291715 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.291689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84n4p\" (UniqueName: \"kubernetes.io/projected/15547e0d-8f10-470f-a80b-0cb53add2696-kube-api-access-84n4p\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r"
Apr 22 19:24:43.784219 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.784186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r"
Apr 22 19:24:43.784706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.784239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:24:43.784706 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:43.784332 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:24:43.784706 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:43.784352 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:24:43.784706 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:43.784394 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls podName:f8046910-c087-4b0d-a917-3216261f41d0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:44.784376283 +0000 UTC m=+34.569366135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls") pod "dns-default-hhwkd" (UID: "f8046910-c087-4b0d-a917-3216261f41d0") : secret "dns-default-metrics-tls" not found
Apr 22 19:24:43.784706 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:43.784421 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert podName:15547e0d-8f10-470f-a80b-0cb53add2696 nodeName:}" failed.
No retries permitted until 2026-04-22 19:24:44.784402749 +0000 UTC m=+34.569392598 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert") pod "ingress-canary-9tp5r" (UID: "15547e0d-8f10-470f-a80b-0cb53add2696") : secret "canary-serving-cert" not found Apr 22 19:24:43.896349 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.896321 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2d3136a-c73d-4ecd-a4e9-a9c3c3605915" containerID="ebcc3399a2e25197fd0c11d24421cbdf2204847fb10fa5fdbaa90152932d0a53" exitCode=0 Apr 22 19:24:43.896486 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:43.896378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" event={"ID":"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915","Type":"ContainerDied","Data":"ebcc3399a2e25197fd0c11d24421cbdf2204847fb10fa5fdbaa90152932d0a53"} Apr 22 19:24:44.388253 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.388214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:44.388469 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.388323 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:44.388469 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.388382 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:25:16.388368717 +0000 UTC m=+66.173358566 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:24:44.489163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.489128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:44.489276 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.489258 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:24:44.489276 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.489272 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:24:44.489350 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.489285 2576 projected.go:194] Error preparing data for projected volume kube-api-access-9jg2g for pod openshift-network-diagnostics/network-check-target-sxrzv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:44.489350 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.489334 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g 
podName:52bb575c-7bf3-4562-b917-b5d06e683525 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:16.489321974 +0000 UTC m=+66.274311824 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9jg2g" (UniqueName: "kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g") pod "network-check-target-sxrzv" (UID: "52bb575c-7bf3-4562-b917-b5d06e683525") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:24:44.750564 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.750474 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:24:44.750564 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.750525 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:24:44.753318 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.753296 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:24:44.754486 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.754467 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:24:44.754596 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.754467 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:24:44.754596 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.754527 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tlsqf\"" Apr 22 19:24:44.754596 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.754546 2576 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wrgld\"" Apr 22 19:24:44.791285 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.791264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:24:44.791585 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.791312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd" Apr 22 19:24:44.791585 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.791407 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:44.791585 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.791461 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert podName:15547e0d-8f10-470f-a80b-0cb53add2696 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:46.791447445 +0000 UTC m=+36.576437300 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert") pod "ingress-canary-9tp5r" (UID: "15547e0d-8f10-470f-a80b-0cb53add2696") : secret "canary-serving-cert" not found Apr 22 19:24:44.791585 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.791414 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:44.791585 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:44.791546 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls podName:f8046910-c087-4b0d-a917-3216261f41d0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:46.791532763 +0000 UTC m=+36.576522613 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls") pod "dns-default-hhwkd" (UID: "f8046910-c087-4b0d-a917-3216261f41d0") : secret "dns-default-metrics-tls" not found Apr 22 19:24:44.900770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.900739 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2d3136a-c73d-4ecd-a4e9-a9c3c3605915" containerID="f0b2cdc1c49c9117ea978426f51dfd08b3f7e0d070000565e85b7a73f9dab497" exitCode=0 Apr 22 19:24:44.900919 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:44.900805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" event={"ID":"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915","Type":"ContainerDied","Data":"f0b2cdc1c49c9117ea978426f51dfd08b3f7e0d070000565e85b7a73f9dab497"} Apr 22 19:24:45.908301 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:45.908271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" 
event={"ID":"b2d3136a-c73d-4ecd-a4e9-a9c3c3605915","Type":"ContainerStarted","Data":"1d5148a85ddb6343e18917863e07f34aa51e22caad7c6e7c63c583cdd5759551"} Apr 22 19:24:45.937397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:45.937350 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4q6q8" podStartSLOduration=5.799299764 podStartE2EDuration="35.937338016s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:24:13.466640193 +0000 UTC m=+3.251630043" lastFinishedPulling="2026-04-22 19:24:43.604678445 +0000 UTC m=+33.389668295" observedRunningTime="2026-04-22 19:24:45.9357496 +0000 UTC m=+35.720739510" watchObservedRunningTime="2026-04-22 19:24:45.937338016 +0000 UTC m=+35.722327887" Apr 22 19:24:46.803955 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:46.803924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:24:46.804087 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:46.803989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd" Apr 22 19:24:46.804087 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:46.804069 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:46.804196 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:46.804109 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:46.804196 
ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:46.804127 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert podName:15547e0d-8f10-470f-a80b-0cb53add2696 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:50.804112392 +0000 UTC m=+40.589102243 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert") pod "ingress-canary-9tp5r" (UID: "15547e0d-8f10-470f-a80b-0cb53add2696") : secret "canary-serving-cert" not found Apr 22 19:24:46.804196 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:46.804159 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls podName:f8046910-c087-4b0d-a917-3216261f41d0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:50.804143873 +0000 UTC m=+40.589133737 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls") pod "dns-default-hhwkd" (UID: "f8046910-c087-4b0d-a917-3216261f41d0") : secret "dns-default-metrics-tls" not found Apr 22 19:24:50.835309 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:50.835271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:24:50.835725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:50.835328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " 
pod="openshift-dns/dns-default-hhwkd" Apr 22 19:24:50.835725 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:50.835405 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:50.835725 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:50.835426 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:50.835725 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:50.835464 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert podName:15547e0d-8f10-470f-a80b-0cb53add2696 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:58.835449571 +0000 UTC m=+48.620439421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert") pod "ingress-canary-9tp5r" (UID: "15547e0d-8f10-470f-a80b-0cb53add2696") : secret "canary-serving-cert" not found Apr 22 19:24:50.835725 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:50.835478 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls podName:f8046910-c087-4b0d-a917-3216261f41d0 nodeName:}" failed. No retries permitted until 2026-04-22 19:24:58.835471318 +0000 UTC m=+48.620461168 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls") pod "dns-default-hhwkd" (UID: "f8046910-c087-4b0d-a917-3216261f41d0") : secret "dns-default-metrics-tls" not found Apr 22 19:24:58.890070 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:58.890029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:24:58.890557 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:24:58.890083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd" Apr 22 19:24:58.890557 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:58.890173 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:24:58.890557 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:58.890180 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:24:58.890557 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:58.890234 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls podName:f8046910-c087-4b0d-a917-3216261f41d0 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:14.890218742 +0000 UTC m=+64.675208593 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls") pod "dns-default-hhwkd" (UID: "f8046910-c087-4b0d-a917-3216261f41d0") : secret "dns-default-metrics-tls" not found Apr 22 19:24:58.890557 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:24:58.890247 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert podName:15547e0d-8f10-470f-a80b-0cb53add2696 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:14.890241223 +0000 UTC m=+64.675231073 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert") pod "ingress-canary-9tp5r" (UID: "15547e0d-8f10-470f-a80b-0cb53add2696") : secret "canary-serving-cert" not found Apr 22 19:25:08.923121 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:08.923091 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9f6tl" Apr 22 19:25:14.891212 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:14.891172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:25:14.891703 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:14.891228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd" Apr 22 19:25:14.891703 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:14.891322 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:25:14.891703 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:14.891324 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:25:14.891703 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:14.891383 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls podName:f8046910-c087-4b0d-a917-3216261f41d0 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:46.891368463 +0000 UTC m=+96.676358313 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls") pod "dns-default-hhwkd" (UID: "f8046910-c087-4b0d-a917-3216261f41d0") : secret "dns-default-metrics-tls" not found Apr 22 19:25:14.891703 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:14.891398 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert podName:15547e0d-8f10-470f-a80b-0cb53add2696 nodeName:}" failed. No retries permitted until 2026-04-22 19:25:46.891391973 +0000 UTC m=+96.676381822 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert") pod "ingress-canary-9tp5r" (UID: "15547e0d-8f10-470f-a80b-0cb53add2696") : secret "canary-serving-cert" not found Apr 22 19:25:16.401231 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.401191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:25:16.403817 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.403796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:25:16.411918 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:16.411900 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:25:16.411999 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:16.411953 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:20.411938794 +0000 UTC m=+130.196928649 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : secret "metrics-daemon-secret" not found Apr 22 19:25:16.502137 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.502100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:25:16.504897 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.504874 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:25:16.514976 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.514950 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:25:16.527306 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.527282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jg2g\" (UniqueName: \"kubernetes.io/projected/52bb575c-7bf3-4562-b917-b5d06e683525-kube-api-access-9jg2g\") pod \"network-check-target-sxrzv\" (UID: \"52bb575c-7bf3-4562-b917-b5d06e683525\") " pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:25:16.567570 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.567547 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wrgld\"" Apr 22 19:25:16.575592 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.575577 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:25:16.694137 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.694108 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sxrzv"] Apr 22 19:25:16.697919 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:25:16.697890 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52bb575c_7bf3_4562_b917_b5d06e683525.slice/crio-eb1727545b0e7ec99ac48a9231de6f3723af1ff54497a27d4372c37b8456c8b4 WatchSource:0}: Error finding container eb1727545b0e7ec99ac48a9231de6f3723af1ff54497a27d4372c37b8456c8b4: Status 404 returned error can't find the container with id eb1727545b0e7ec99ac48a9231de6f3723af1ff54497a27d4372c37b8456c8b4 Apr 22 19:25:16.968186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:16.968106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sxrzv" event={"ID":"52bb575c-7bf3-4562-b917-b5d06e683525","Type":"ContainerStarted","Data":"eb1727545b0e7ec99ac48a9231de6f3723af1ff54497a27d4372c37b8456c8b4"} Apr 22 19:25:19.974831 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:19.974792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sxrzv" event={"ID":"52bb575c-7bf3-4562-b917-b5d06e683525","Type":"ContainerStarted","Data":"086a351cb2eb18f3e8cf7a5d79db4fc4c6a019b5d70e4fc964195d58f045c6ae"} Apr 22 19:25:19.975292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:19.974936 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:25:19.993780 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:19.993728 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sxrzv" 
podStartSLOduration=67.42133918 podStartE2EDuration="1m9.993714315s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:25:16.699673042 +0000 UTC m=+66.484662892" lastFinishedPulling="2026-04-22 19:25:19.272048174 +0000 UTC m=+69.057038027" observedRunningTime="2026-04-22 19:25:19.993347255 +0000 UTC m=+69.778337122" watchObservedRunningTime="2026-04-22 19:25:19.993714315 +0000 UTC m=+69.778704191" Apr 22 19:25:46.910691 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:46.910649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:25:46.910691 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:46.910702 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd" Apr 22 19:25:46.911131 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:46.910790 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:25:46.911131 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:46.910791 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:25:46.911131 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:46.910841 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls podName:f8046910-c087-4b0d-a917-3216261f41d0 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:26:50.910825688 +0000 UTC m=+160.695815538 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls") pod "dns-default-hhwkd" (UID: "f8046910-c087-4b0d-a917-3216261f41d0") : secret "dns-default-metrics-tls" not found Apr 22 19:25:46.911131 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:25:46.910855 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert podName:15547e0d-8f10-470f-a80b-0cb53add2696 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:50.910848662 +0000 UTC m=+160.695838512 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert") pod "ingress-canary-9tp5r" (UID: "15547e0d-8f10-470f-a80b-0cb53add2696") : secret "canary-serving-cert" not found Apr 22 19:25:50.979236 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:25:50.979204 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sxrzv" Apr 22 19:26:16.569147 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.569109 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4"] Apr 22 19:26:16.571154 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.571139 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.572479 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.572455 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79d8474b76-rbnlg"] Apr 22 19:26:16.573455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.573435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bvkmj\"" Apr 22 19:26:16.573579 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.573562 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 19:26:16.573663 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.573645 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:26:16.573933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.573906 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.574009 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.573995 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:26:16.576971 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.576951 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 19:26:16.578099 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.577947 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 19:26:16.578290 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.578273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 19:26:16.578632 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.578616 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 19:26:16.578926 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.578903 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:26:16.578926 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.578914 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 19:26:16.579024 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.578962 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-s4sfm\"" Apr 22 19:26:16.579514 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.579488 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:26:16.586345 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.586326 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4"] Apr 22 19:26:16.587443 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.587425 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79d8474b76-rbnlg"] Apr 22 19:26:16.677062 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.677026 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b7xnl"] Apr 22 19:26:16.678773 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.678755 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.681181 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.681160 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 19:26:16.681310 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.681257 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:26:16.681474 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.681460 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 19:26:16.681572 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.681546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wklrt\"" Apr 22 19:26:16.681572 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.681564 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:26:16.687007 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.686986 2576 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 19:26:16.689776 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.689756 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b7xnl"] Apr 22 19:26:16.709984 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.709956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.710129 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.710012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8fzj\" (UniqueName: \"kubernetes.io/projected/9a37ca65-2a95-4238-9686-942fb21e4095-kube-api-access-d8fzj\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.710129 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.710067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d6b05d52-0d1e-4259-a36b-3b1d0e753715-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.710129 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.710100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24mvx\" (UniqueName: \"kubernetes.io/projected/d6b05d52-0d1e-4259-a36b-3b1d0e753715-kube-api-access-24mvx\") pod 
\"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.710274 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.710142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.710274 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.710235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-stats-auth\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.710357 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.710285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-default-certificate\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.710357 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.710307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 
19:26:16.810599 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24mvx\" (UniqueName: \"kubernetes.io/projected/d6b05d52-0d1e-4259-a36b-3b1d0e753715-kube-api-access-24mvx\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.810599 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-tmp\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810645 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-snapshots\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-serving-cert\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-stats-auth\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-service-ca-bundle\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-default-certificate\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:16.810768 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.810814 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.811241 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:16.810842 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:26:16.811241 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:16.810872 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls podName:d6b05d52-0d1e-4259-a36b-3b1d0e753715 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:17.310855604 +0000 UTC m=+127.095845463 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-glpd4" (UID: "d6b05d52-0d1e-4259-a36b-3b1d0e753715") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:16.811241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.810930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.811241 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:16.810948 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:17.310927674 +0000 UTC m=+127.095917527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : secret "router-metrics-certs-default" not found Apr 22 19:26:16.811241 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:16.811026 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:17.31101298 +0000 UTC m=+127.096002831 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : configmap references non-existent config key: service-ca.crt Apr 22 19:26:16.811241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.811054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8fzj\" (UniqueName: \"kubernetes.io/projected/9a37ca65-2a95-4238-9686-942fb21e4095-kube-api-access-d8fzj\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.811241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.811087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66mw\" (UniqueName: \"kubernetes.io/projected/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-kube-api-access-d66mw\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.811241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.811131 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d6b05d52-0d1e-4259-a36b-3b1d0e753715-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.811735 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.811718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d6b05d52-0d1e-4259-a36b-3b1d0e753715-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: 
\"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.813074 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.813054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-stats-auth\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.813266 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.813249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-default-certificate\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.840279 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.822073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24mvx\" (UniqueName: \"kubernetes.io/projected/d6b05d52-0d1e-4259-a36b-3b1d0e753715-kube-api-access-24mvx\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:16.840731 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.840713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8fzj\" (UniqueName: \"kubernetes.io/projected/9a37ca65-2a95-4238-9686-942fb21e4095-kube-api-access-d8fzj\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:16.911982 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.911955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d66mw\" (UniqueName: \"kubernetes.io/projected/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-kube-api-access-d66mw\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.912109 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.912008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-tmp\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.912109 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.912060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-snapshots\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.912109 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.912084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-serving-cert\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.912252 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.912119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-service-ca-bundle\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.912252 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.912184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.912447 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.912421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-tmp\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.913075 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.913047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-snapshots\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.913181 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.913076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-service-ca-bundle\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.913469 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.913450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: 
\"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.914300 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.914270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-serving-cert\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.921044 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.921024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d66mw\" (UniqueName: \"kubernetes.io/projected/1a23807b-8ae8-4f57-aaa4-f01cc3d1f680-kube-api-access-d66mw\") pod \"insights-operator-585dfdc468-b7xnl\" (UID: \"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680\") " pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:16.988730 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:16.988704 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-b7xnl" Apr 22 19:26:17.100413 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:17.100287 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-b7xnl"] Apr 22 19:26:17.104831 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:17.104805 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a23807b_8ae8_4f57_aaa4_f01cc3d1f680.slice/crio-94f71852dbd370273ff74b2462a357ba31a477dac08c7c5f410148069085c4ec WatchSource:0}: Error finding container 94f71852dbd370273ff74b2462a357ba31a477dac08c7c5f410148069085c4ec: Status 404 returned error can't find the container with id 94f71852dbd370273ff74b2462a357ba31a477dac08c7c5f410148069085c4ec Apr 22 19:26:17.314088 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:17.314055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:17.314290 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:17.314100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:17.314290 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:17.314122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:17.314290 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:17.314213 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:26:17.314290 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:17.314232 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:18.314219442 +0000 UTC m=+128.099209292 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : configmap references non-existent config key: service-ca.crt Apr 22 19:26:17.314290 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:17.314211 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:17.314290 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:17.314264 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:18.31425174 +0000 UTC m=+128.099241595 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : secret "router-metrics-certs-default" not found Apr 22 19:26:17.314542 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:17.314305 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls podName:d6b05d52-0d1e-4259-a36b-3b1d0e753715 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:18.314286405 +0000 UTC m=+128.099276269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-glpd4" (UID: "d6b05d52-0d1e-4259-a36b-3b1d0e753715") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:18.082804 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:18.082763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b7xnl" event={"ID":"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680","Type":"ContainerStarted","Data":"94f71852dbd370273ff74b2462a357ba31a477dac08c7c5f410148069085c4ec"} Apr 22 19:26:18.323367 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:18.323311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:18.323367 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:18.323377 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:18.323601 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:18.323400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:18.323601 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:18.323469 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:18.323601 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:18.323529 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:26:18.323601 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:18.323557 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:20.323543459 +0000 UTC m=+130.108533309 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : configmap references non-existent config key: service-ca.crt Apr 22 19:26:18.323601 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:18.323572 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls podName:d6b05d52-0d1e-4259-a36b-3b1d0e753715 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:20.323565928 +0000 UTC m=+130.108555777 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-glpd4" (UID: "d6b05d52-0d1e-4259-a36b-3b1d0e753715") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:18.323601 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:18.323582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:20.323576467 +0000 UTC m=+130.108566317 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : secret "router-metrics-certs-default" not found Apr 22 19:26:20.090115 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:20.089328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b7xnl" event={"ID":"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680","Type":"ContainerStarted","Data":"e41539ae9b42c7830c9a88a8c4724ffea90b84096e0883a85be6b08cf6311bd9"} Apr 22 19:26:20.108177 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:20.108123 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-b7xnl" podStartSLOduration=2.026209973 podStartE2EDuration="4.108106889s" podCreationTimestamp="2026-04-22 19:26:16 +0000 UTC" firstStartedPulling="2026-04-22 19:26:17.106479265 +0000 UTC m=+126.891469115" lastFinishedPulling="2026-04-22 19:26:19.188376178 +0000 UTC m=+128.973366031" observedRunningTime="2026-04-22 19:26:20.106765685 +0000 UTC m=+129.891755582" watchObservedRunningTime="2026-04-22 19:26:20.108106889 +0000 UTC m=+129.893096761" Apr 22 19:26:20.340876 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:20.340796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:20.340876 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:20.340840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:20.340876 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:20.340861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:20.341105 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:20.340944 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:26:20.341105 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:20.340944 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:20.341105 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:20.341003 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:24.340990659 +0000 UTC m=+134.125980509 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : configmap references non-existent config key: service-ca.crt Apr 22 19:26:20.341105 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:20.341018 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:24.341012263 +0000 UTC m=+134.126002114 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : secret "router-metrics-certs-default" not found Apr 22 19:26:20.341105 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:20.341030 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls podName:d6b05d52-0d1e-4259-a36b-3b1d0e753715 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:24.341023056 +0000 UTC m=+134.126012907 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-glpd4" (UID: "d6b05d52-0d1e-4259-a36b-3b1d0e753715") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:20.441955 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:20.441925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:26:20.442124 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:20.442080 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:26:20.442192 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:20.442151 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs podName:342209dc-2f51-4fc2-a96f-a19424f86d57 nodeName:}" failed. No retries permitted until 2026-04-22 19:28:22.442132857 +0000 UTC m=+252.227122709 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs") pod "network-metrics-daemon-rqq85" (UID: "342209dc-2f51-4fc2-a96f-a19424f86d57") : secret "metrics-daemon-secret" not found Apr 22 19:26:22.102229 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:22.102201 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-frj6d_9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6/dns-node-resolver/0.log" Apr 22 19:26:23.302674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:23.302651 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-klmr4_c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a/node-ca/0.log" Apr 22 19:26:24.374689 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.374643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" Apr 22 19:26:24.374689 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.374697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:24.375096 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.374720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: 
\"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:24.375096 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:24.374794 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:26:24.375096 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:24.374796 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:24.375096 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:24.374841 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:32.374826625 +0000 UTC m=+142.159816474 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : configmap references non-existent config key: service-ca.crt Apr 22 19:26:24.375096 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:24.374856 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:32.374848715 +0000 UTC m=+142.159838565 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : secret "router-metrics-certs-default" not found Apr 22 19:26:24.375096 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:24.374867 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls podName:d6b05d52-0d1e-4259-a36b-3b1d0e753715 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:32.374860793 +0000 UTC m=+142.159850643 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-glpd4" (UID: "d6b05d52-0d1e-4259-a36b-3b1d0e753715") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:26:24.580975 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.580945 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn"] Apr 22 19:26:24.582701 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.582686 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn" Apr 22 19:26:24.585169 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.585147 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-7ttmt\"" Apr 22 19:26:24.585281 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.585148 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:26:24.586264 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.586246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:26:24.593950 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.593929 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn"] Apr 22 19:26:24.677970 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.677883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxp8q\" (UniqueName: \"kubernetes.io/projected/6ea4f413-e0ef-4b63-9f20-54a4234931a6-kube-api-access-gxp8q\") pod \"volume-data-source-validator-7c6cbb6c87-dj2zn\" (UID: \"6ea4f413-e0ef-4b63-9f20-54a4234931a6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn" Apr 22 19:26:24.686044 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.686016 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt"] Apr 22 19:26:24.687903 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.687887 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" Apr 22 19:26:24.690403 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.690366 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:26:24.690554 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.690473 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:26:24.690554 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.690523 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:26:24.690823 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.690810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-dcfc9\"" Apr 22 19:26:24.698233 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.698212 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt"] Apr 22 19:26:24.778328 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.778302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxp8q\" (UniqueName: \"kubernetes.io/projected/6ea4f413-e0ef-4b63-9f20-54a4234931a6-kube-api-access-gxp8q\") pod \"volume-data-source-validator-7c6cbb6c87-dj2zn\" (UID: \"6ea4f413-e0ef-4b63-9f20-54a4234931a6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn" Apr 22 19:26:24.778478 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.778351 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" Apr 22 19:26:24.778478 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.778373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccflx\" (UniqueName: \"kubernetes.io/projected/a7f581bb-b565-4a28-8445-ef068ddffac6-kube-api-access-ccflx\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" Apr 22 19:26:24.786286 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.786256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxp8q\" (UniqueName: \"kubernetes.io/projected/6ea4f413-e0ef-4b63-9f20-54a4234931a6-kube-api-access-gxp8q\") pod \"volume-data-source-validator-7c6cbb6c87-dj2zn\" (UID: \"6ea4f413-e0ef-4b63-9f20-54a4234931a6\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn" Apr 22 19:26:24.879417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.879379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" Apr 22 19:26:24.879417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.879421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccflx\" (UniqueName: \"kubernetes.io/projected/a7f581bb-b565-4a28-8445-ef068ddffac6-kube-api-access-ccflx\") pod 
\"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" Apr 22 19:26:24.879625 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:24.879535 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:26:24.879625 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:24.879595 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls podName:a7f581bb-b565-4a28-8445-ef068ddffac6 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:25.379580116 +0000 UTC m=+135.164569966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kthnt" (UID: "a7f581bb-b565-4a28-8445-ef068ddffac6") : secret "samples-operator-tls" not found Apr 22 19:26:24.890010 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.889987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccflx\" (UniqueName: \"kubernetes.io/projected/a7f581bb-b565-4a28-8445-ef068ddffac6-kube-api-access-ccflx\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" Apr 22 19:26:24.891792 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:24.891776 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn" Apr 22 19:26:25.004271 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:25.004246 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn"] Apr 22 19:26:25.007610 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:25.007578 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ea4f413_e0ef_4b63_9f20_54a4234931a6.slice/crio-7e776cca546064ace3fdd81df9e8ead05c0b580ba4321ab57a598f7f55a4ec7c WatchSource:0}: Error finding container 7e776cca546064ace3fdd81df9e8ead05c0b580ba4321ab57a598f7f55a4ec7c: Status 404 returned error can't find the container with id 7e776cca546064ace3fdd81df9e8ead05c0b580ba4321ab57a598f7f55a4ec7c Apr 22 19:26:25.100554 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:25.100522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn" event={"ID":"6ea4f413-e0ef-4b63-9f20-54a4234931a6","Type":"ContainerStarted","Data":"7e776cca546064ace3fdd81df9e8ead05c0b580ba4321ab57a598f7f55a4ec7c"} Apr 22 19:26:25.383952 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:25.383918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" Apr 22 19:26:25.384318 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:25.384044 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:26:25.384318 ip-10-0-132-160 
kubenswrapper[2576]: E0422 19:26:25.384096 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls podName:a7f581bb-b565-4a28-8445-ef068ddffac6 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:26.38408367 +0000 UTC m=+136.169073521 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kthnt" (UID: "a7f581bb-b565-4a28-8445-ef068ddffac6") : secret "samples-operator-tls" not found Apr 22 19:26:26.392331 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.392297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" Apr 22 19:26:26.392815 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:26.392436 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:26:26.392815 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:26.392534 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls podName:a7f581bb-b565-4a28-8445-ef068ddffac6 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:28.392492875 +0000 UTC m=+138.177482725 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kthnt" (UID: "a7f581bb-b565-4a28-8445-ef068ddffac6") : secret "samples-operator-tls" not found Apr 22 19:26:26.574405 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.574360 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tzsmr"] Apr 22 19:26:26.576882 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.576852 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" Apr 22 19:26:26.579232 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.579205 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 19:26:26.579589 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.579572 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 19:26:26.579589 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.579583 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 19:26:26.580452 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.580438 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:26:26.580541 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.580516 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-hfxnv\"" Apr 22 19:26:26.586647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.586629 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 19:26:26.587186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.587167 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tzsmr"] Apr 22 19:26:26.677583 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.677556 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"] Apr 22 19:26:26.679446 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.679432 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs" Apr 22 19:26:26.681848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.681829 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 19:26:26.681970 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.681941 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:26:26.681970 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.681958 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 19:26:26.682086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.681957 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-sgdq4\"" Apr 22 19:26:26.682086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.682059 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" 
Apr 22 19:26:26.689408 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.689390 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"] Apr 22 19:26:26.694721 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.694700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnvxp\" (UniqueName: \"kubernetes.io/projected/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-kube-api-access-lnvxp\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" Apr 22 19:26:26.694832 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.694764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-trusted-ca\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" Apr 22 19:26:26.694832 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.694793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-config\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" Apr 22 19:26:26.694932 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.694842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-serving-cert\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.796174 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.796137    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-config\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.796341 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.796183    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e967fc-c463-4f10-9b81-910499e78afc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.796341 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.796223    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpdlc\" (UniqueName: \"kubernetes.io/projected/89e967fc-c463-4f10-9b81-910499e78afc-kube-api-access-gpdlc\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.796341 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.796275    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnvxp\" (UniqueName: \"kubernetes.io/projected/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-kube-api-access-lnvxp\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.796455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.796340    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-trusted-ca\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.796455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.796383    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-serving-cert\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.796455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.796410    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e967fc-c463-4f10-9b81-910499e78afc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.796834 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.796814    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-config\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.797132 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.797115    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-trusted-ca\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.798681 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.798666    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-serving-cert\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.804516 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.804485    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnvxp\" (UniqueName: \"kubernetes.io/projected/2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4-kube-api-access-lnvxp\") pod \"console-operator-9d4b6777b-tzsmr\" (UID: \"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4\") " pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.885843 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.885763    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:26.897642 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.897616    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpdlc\" (UniqueName: \"kubernetes.io/projected/89e967fc-c463-4f10-9b81-910499e78afc-kube-api-access-gpdlc\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.897782 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.897732    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e967fc-c463-4f10-9b81-910499e78afc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.897849 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.897835    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e967fc-c463-4f10-9b81-910499e78afc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.898484 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.898418    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89e967fc-c463-4f10-9b81-910499e78afc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.900302 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.900280    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89e967fc-c463-4f10-9b81-910499e78afc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.905525 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.905491    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpdlc\" (UniqueName: \"kubernetes.io/projected/89e967fc-c463-4f10-9b81-910499e78afc-kube-api-access-gpdlc\") pod \"kube-storage-version-migrator-operator-6769c5d45-788qs\" (UID: \"89e967fc-c463-4f10-9b81-910499e78afc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.987929 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.987899    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"
Apr 22 19:26:26.998322 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:26.998277    2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-tzsmr"]
Apr 22 19:26:27.000556 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:27.000524    2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a2eeae7_3634_4f74_a8ab_198bbd3ed2a4.slice/crio-693a7e8ba0edf48303ca5b83f32706c08bb56f29e2c393d3de5e6fcaefc252f0 WatchSource:0}: Error finding container 693a7e8ba0edf48303ca5b83f32706c08bb56f29e2c393d3de5e6fcaefc252f0: Status 404 returned error can't find the container with id 693a7e8ba0edf48303ca5b83f32706c08bb56f29e2c393d3de5e6fcaefc252f0
Apr 22 19:26:27.100853 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:27.100815    2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs"]
Apr 22 19:26:27.104532 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:27.104481    2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e967fc_c463_4f10_9b81_910499e78afc.slice/crio-e64a53eb815c32b73051f29a8c1f57238bc6a748f14364c1080fed632c82703b WatchSource:0}: Error finding container e64a53eb815c32b73051f29a8c1f57238bc6a748f14364c1080fed632c82703b: Status 404 returned error can't find the container with id e64a53eb815c32b73051f29a8c1f57238bc6a748f14364c1080fed632c82703b
Apr 22 19:26:27.106002 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:27.105860    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" event={"ID":"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4","Type":"ContainerStarted","Data":"693a7e8ba0edf48303ca5b83f32706c08bb56f29e2c393d3de5e6fcaefc252f0"}
Apr 22 19:26:27.107396 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:27.107375    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn" event={"ID":"6ea4f413-e0ef-4b63-9f20-54a4234931a6","Type":"ContainerStarted","Data":"483142c09a58621b92c07868e2d1419b9664a4c3b54b41b5745e8a08b091f6e0"}
Apr 22 19:26:27.123045 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:27.123006    2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dj2zn" podStartSLOduration=1.910155407 podStartE2EDuration="3.122994671s" podCreationTimestamp="2026-04-22 19:26:24 +0000 UTC" firstStartedPulling="2026-04-22 19:26:25.009154759 +0000 UTC m=+134.794144609" lastFinishedPulling="2026-04-22 19:26:26.221994004 +0000 UTC m=+136.006983873" observedRunningTime="2026-04-22 19:26:27.122246172 +0000 UTC m=+136.907236045" watchObservedRunningTime="2026-04-22 19:26:27.122994671 +0000 UTC m=+136.907984542"
Apr 22 19:26:28.110909 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:28.110869    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs" event={"ID":"89e967fc-c463-4f10-9b81-910499e78afc","Type":"ContainerStarted","Data":"e64a53eb815c32b73051f29a8c1f57238bc6a748f14364c1080fed632c82703b"}
Apr 22 19:26:28.411434 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:28.411347    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt"
Apr 22 19:26:28.411622 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:28.411512    2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:26:28.411622 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:28.411580    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls podName:a7f581bb-b565-4a28-8445-ef068ddffac6 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:32.411558596 +0000 UTC m=+142.196548454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kthnt" (UID: "a7f581bb-b565-4a28-8445-ef068ddffac6") : secret "samples-operator-tls" not found
Apr 22 19:26:30.121465 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:30.121424    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs" event={"ID":"89e967fc-c463-4f10-9b81-910499e78afc","Type":"ContainerStarted","Data":"d30cd0dfd91b9f6d24ff4b9688815f31a7552930b485fc39d2aa71b173598e20"}
Apr 22 19:26:30.123138 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:30.123116    2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/0.log"
Apr 22 19:26:30.123264 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:30.123154    2576 generic.go:358] "Generic (PLEG): container finished" podID="2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4" containerID="c86c11a2264f8ed4b878f61ff025c9cfce0adfb67551d5bd48c384ac5d9db541" exitCode=255
Apr 22 19:26:30.123264 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:30.123201    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" event={"ID":"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4","Type":"ContainerDied","Data":"c86c11a2264f8ed4b878f61ff025c9cfce0adfb67551d5bd48c384ac5d9db541"}
Apr 22 19:26:30.123416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:30.123401    2576 scope.go:117] "RemoveContainer" containerID="c86c11a2264f8ed4b878f61ff025c9cfce0adfb67551d5bd48c384ac5d9db541"
Apr 22 19:26:30.139878 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:30.139816    2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs" podStartSLOduration=1.9229883989999998 podStartE2EDuration="4.139797083s" podCreationTimestamp="2026-04-22 19:26:26 +0000 UTC" firstStartedPulling="2026-04-22 19:26:27.106434437 +0000 UTC m=+136.891424288" lastFinishedPulling="2026-04-22 19:26:29.32324311 +0000 UTC m=+139.108232972" observedRunningTime="2026-04-22 19:26:30.138818692 +0000 UTC m=+139.923808573" watchObservedRunningTime="2026-04-22 19:26:30.139797083 +0000 UTC m=+139.924786951"
Apr 22 19:26:31.126871 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:31.126845    2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log"
Apr 22 19:26:31.127242 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:31.127220    2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/0.log"
Apr 22 19:26:31.127286 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:31.127255    2576 generic.go:358] "Generic (PLEG): container finished" podID="2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4" containerID="d490bc0ce71d1cd7c1d5629b56411ea72272e9b388c37c92df6249e30aa41291" exitCode=255
Apr 22 19:26:31.127357 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:31.127337    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" event={"ID":"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4","Type":"ContainerDied","Data":"d490bc0ce71d1cd7c1d5629b56411ea72272e9b388c37c92df6249e30aa41291"}
Apr 22 19:26:31.127395 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:31.127376    2576 scope.go:117] "RemoveContainer" containerID="c86c11a2264f8ed4b878f61ff025c9cfce0adfb67551d5bd48c384ac5d9db541"
Apr 22 19:26:31.127620 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:31.127601    2576 scope.go:117] "RemoveContainer" containerID="d490bc0ce71d1cd7c1d5629b56411ea72272e9b388c37c92df6249e30aa41291"
Apr 22 19:26:31.127812 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:31.127791    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tzsmr_openshift-console-operator(2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" podUID="2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4"
Apr 22 19:26:32.130828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.130799    2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log"
Apr 22 19:26:32.131193 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.131127    2576 scope.go:117] "RemoveContainer" containerID="d490bc0ce71d1cd7c1d5629b56411ea72272e9b388c37c92df6249e30aa41291"
Apr 22 19:26:32.131302 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:32.131285    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tzsmr_openshift-console-operator(2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" podUID="2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4"
Apr 22 19:26:32.390787 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.390707    2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gh87h"]
Apr 22 19:26:32.392673 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.392657    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.395183 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.395162    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 22 19:26:32.395307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.395184    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 22 19:26:32.395307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.395215    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 22 19:26:32.395307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.395164    2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lnx8j\""
Apr 22 19:26:32.395975 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.395962    2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 22 19:26:32.400706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.400685    2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gh87h"]
Apr 22 19:26:32.446019 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.445994    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4"
Apr 22 19:26:32.446191 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.446061    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg"
Apr 22 19:26:32.446191 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:32.446133    2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:26:32.446191 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.446150    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg"
Apr 22 19:26:32.446191 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.446171    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt"
Apr 22 19:26:32.446191 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:32.446133    2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 19:26:32.446416 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:32.446219    2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 19:26:32.446416 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:32.446193    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls podName:d6b05d52-0d1e-4259-a36b-3b1d0e753715 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:48.446179532 +0000 UTC m=+158.231169381 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-glpd4" (UID: "d6b05d52-0d1e-4259-a36b-3b1d0e753715") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:26:32.446416 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:32.446269    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:48.446249528 +0000 UTC m=+158.231239392 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : secret "router-metrics-certs-default" not found
Apr 22 19:26:32.446416 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:32.446291    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls podName:a7f581bb-b565-4a28-8445-ef068ddffac6 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:40.446279798 +0000 UTC m=+150.231269651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-kthnt" (UID: "a7f581bb-b565-4a28-8445-ef068ddffac6") : secret "samples-operator-tls" not found
Apr 22 19:26:32.446416 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:32.446311    2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle podName:9a37ca65-2a95-4238-9686-942fb21e4095 nodeName:}" failed. No retries permitted until 2026-04-22 19:26:48.44630129 +0000 UTC m=+158.231291146 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle") pod "router-default-79d8474b76-rbnlg" (UID: "9a37ca65-2a95-4238-9686-942fb21e4095") : configmap references non-existent config key: service-ca.crt
Apr 22 19:26:32.547658 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.547625    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fd255e08-dbd3-44e8-85f7-f86a24fea88f-signing-cabundle\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.547819 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.547685    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fd255e08-dbd3-44e8-85f7-f86a24fea88f-signing-key\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.547819 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.547753    2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpjc7\" (UniqueName: \"kubernetes.io/projected/fd255e08-dbd3-44e8-85f7-f86a24fea88f-kube-api-access-bpjc7\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.649151 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.649058    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fd255e08-dbd3-44e8-85f7-f86a24fea88f-signing-key\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.649151 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.649115    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpjc7\" (UniqueName: \"kubernetes.io/projected/fd255e08-dbd3-44e8-85f7-f86a24fea88f-kube-api-access-bpjc7\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.649338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.649249    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fd255e08-dbd3-44e8-85f7-f86a24fea88f-signing-cabundle\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.649863 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.649841    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fd255e08-dbd3-44e8-85f7-f86a24fea88f-signing-cabundle\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.651478 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.651454    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fd255e08-dbd3-44e8-85f7-f86a24fea88f-signing-key\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.658867 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.658848    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpjc7\" (UniqueName: \"kubernetes.io/projected/fd255e08-dbd3-44e8-85f7-f86a24fea88f-kube-api-access-bpjc7\") pod \"service-ca-865cb79987-gh87h\" (UID: \"fd255e08-dbd3-44e8-85f7-f86a24fea88f\") " pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.701796 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.701761    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-gh87h"
Apr 22 19:26:32.815567 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:32.815533    2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-gh87h"]
Apr 22 19:26:32.819511 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:32.819462    2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd255e08_dbd3_44e8_85f7_f86a24fea88f.slice/crio-88894300e069cf9337804aecce2462708781651cbffbedc52128ab93ff389fca WatchSource:0}: Error finding container 88894300e069cf9337804aecce2462708781651cbffbedc52128ab93ff389fca: Status 404 returned error can't find the container with id 88894300e069cf9337804aecce2462708781651cbffbedc52128ab93ff389fca
Apr 22 19:26:33.134411 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:33.134376    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-gh87h" event={"ID":"fd255e08-dbd3-44e8-85f7-f86a24fea88f","Type":"ContainerStarted","Data":"88894300e069cf9337804aecce2462708781651cbffbedc52128ab93ff389fca"}
Apr 22 19:26:35.139771 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:35.139735    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-gh87h" event={"ID":"fd255e08-dbd3-44e8-85f7-f86a24fea88f","Type":"ContainerStarted","Data":"43d817197f0393d02dafaae115d6e50f479204f7dae467504b17c4fe9c5b6660"}
Apr 22 19:26:35.159468 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:35.159414    2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-gh87h" podStartSLOduration=1.638722262 podStartE2EDuration="3.159395883s" podCreationTimestamp="2026-04-22 19:26:32 +0000 UTC" firstStartedPulling="2026-04-22 19:26:32.821311513 +0000 UTC m=+142.606301363" lastFinishedPulling="2026-04-22 19:26:34.34198512 +0000 UTC m=+144.126974984" observedRunningTime="2026-04-22 19:26:35.157815762 +0000 UTC m=+144.942805633" watchObservedRunningTime="2026-04-22 19:26:35.159395883 +0000 UTC m=+144.944385755"
Apr 22 19:26:36.886870 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:36.886834    2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:36.887201 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:36.886886    2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr"
Apr 22 19:26:36.887350 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:36.887333    2576 scope.go:117] "RemoveContainer" containerID="d490bc0ce71d1cd7c1d5629b56411ea72272e9b388c37c92df6249e30aa41291"
Apr 22 19:26:36.887585 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:36.887563    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-tzsmr_openshift-console-operator(2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" podUID="2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4"
Apr 22 19:26:40.515516 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:40.515461    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt"
Apr 22 19:26:40.518015 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:40.517986    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7f581bb-b565-4a28-8445-ef068ddffac6-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-kthnt\" (UID: \"a7f581bb-b565-4a28-8445-ef068ddffac6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt"
Apr 22 19:26:40.596340 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:40.596306    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt"
Apr 22 19:26:40.732101 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:40.732054    2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt"]
Apr 22 19:26:41.154879 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:41.154799    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" event={"ID":"a7f581bb-b565-4a28-8445-ef068ddffac6","Type":"ContainerStarted","Data":"3b030452a4deddecbe5453db3d76bc51823d9a4272bd09001a3aa3ca8504a5f9"}
Apr 22 19:26:43.161776 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:43.161739    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" event={"ID":"a7f581bb-b565-4a28-8445-ef068ddffac6","Type":"ContainerStarted","Data":"167d1933bcf88b7dcc3f9757aa40bfab43f165a3202b2ae9b313f3b7e4168570"}
Apr 22 19:26:43.161776 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:43.161778    2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" event={"ID":"a7f581bb-b565-4a28-8445-ef068ddffac6","Type":"ContainerStarted","Data":"5f2135b191cb196ace10c7bfab6d843355d8ac9e27dcc2e98b73a06df305735e"}
Apr 22 19:26:43.179639 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:43.179581    2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-kthnt" podStartSLOduration=17.412229249 podStartE2EDuration="19.179562943s" podCreationTimestamp="2026-04-22 19:26:24 +0000 UTC" firstStartedPulling="2026-04-22 19:26:40.771260335 +0000 UTC m=+150.556250189" lastFinishedPulling="2026-04-22 19:26:42.538594019 +0000 UTC m=+152.323583883" observedRunningTime="2026-04-22 19:26:43.178248335 +0000 UTC m=+152.963238220" watchObservedRunningTime="2026-04-22 19:26:43.179562943 +0000 UTC m=+152.964552816"
Apr 22 19:26:46.079174 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:46.079123    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9tp5r" podUID="15547e0d-8f10-470f-a80b-0cb53add2696"
Apr 22 19:26:46.085286 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:46.085258    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hhwkd" podUID="f8046910-c087-4b0d-a917-3216261f41d0"
Apr 22 19:26:46.168606 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:46.168579    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9tp5r"
Apr 22 19:26:46.168766 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:46.168579    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:26:47.760221 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:26:47.760167    2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rqq85" podUID="342209dc-2f51-4fc2-a96f-a19424f86d57"
Apr 22 19:26:48.478574 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.478541    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg"
Apr 22 19:26:48.478828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.478583    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg"
Apr 22 19:26:48.478828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.478629    2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4"
Apr 22 19:26:48.479295 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.479268    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a37ca65-2a95-4238-9686-942fb21e4095-service-ca-bundle\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg"
Apr 22 19:26:48.480986 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.480964    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9a37ca65-2a95-4238-9686-942fb21e4095-metrics-certs\") pod \"router-default-79d8474b76-rbnlg\" (UID: \"9a37ca65-2a95-4238-9686-942fb21e4095\") " pod="openshift-ingress/router-default-79d8474b76-rbnlg"
Apr 22 19:26:48.481098 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.480971    2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6b05d52-0d1e-4259-a36b-3b1d0e753715-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-glpd4\" (UID: \"d6b05d52-0d1e-4259-a36b-3b1d0e753715\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4"
Apr 22 19:26:48.681701 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.681665    2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4"
Apr 22 19:26:48.686488 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.686460    2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:48.754596 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.754570 2576 scope.go:117] "RemoveContainer" containerID="d490bc0ce71d1cd7c1d5629b56411ea72272e9b388c37c92df6249e30aa41291" Apr 22 19:26:48.804651 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.804602 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4"] Apr 22 19:26:48.808919 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:48.808893 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b05d52_0d1e_4259_a36b_3b1d0e753715.slice/crio-d6abff1d04450e610f9086c891f3c59abea2cf013d6d64bbc8af3f765896365d WatchSource:0}: Error finding container d6abff1d04450e610f9086c891f3c59abea2cf013d6d64bbc8af3f765896365d: Status 404 returned error can't find the container with id d6abff1d04450e610f9086c891f3c59abea2cf013d6d64bbc8af3f765896365d Apr 22 19:26:48.824641 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:48.824616 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-79d8474b76-rbnlg"] Apr 22 19:26:48.827529 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:48.827494 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a37ca65_2a95_4238_9686_942fb21e4095.slice/crio-fa9f09ebf583c168852e551618c2fd8388773b7b1fe1af7291079375e592da29 WatchSource:0}: Error finding container fa9f09ebf583c168852e551618c2fd8388773b7b1fe1af7291079375e592da29: Status 404 returned error can't find the container with id fa9f09ebf583c168852e551618c2fd8388773b7b1fe1af7291079375e592da29 Apr 22 19:26:49.176579 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.176541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" event={"ID":"d6b05d52-0d1e-4259-a36b-3b1d0e753715","Type":"ContainerStarted","Data":"d6abff1d04450e610f9086c891f3c59abea2cf013d6d64bbc8af3f765896365d"} Apr 22 19:26:49.177985 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.177955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79d8474b76-rbnlg" event={"ID":"9a37ca65-2a95-4238-9686-942fb21e4095","Type":"ContainerStarted","Data":"65383cdf602ec6fba0b823490984020339592f5de7902641421fa24ffcb08c86"} Apr 22 19:26:49.178107 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.177991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79d8474b76-rbnlg" event={"ID":"9a37ca65-2a95-4238-9686-942fb21e4095","Type":"ContainerStarted","Data":"fa9f09ebf583c168852e551618c2fd8388773b7b1fe1af7291079375e592da29"} Apr 22 19:26:49.179734 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.179716 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log" Apr 22 19:26:49.179852 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.179762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" event={"ID":"2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4","Type":"ContainerStarted","Data":"19c900415091eeba647a87cdb16f93115413b7a33296bba093b187c99e2ad952"} Apr 22 19:26:49.179997 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.179961 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" Apr 22 19:26:49.185326 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.185303 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" Apr 22 
19:26:49.199402 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.199358 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79d8474b76-rbnlg" podStartSLOduration=33.199345723 podStartE2EDuration="33.199345723s" podCreationTimestamp="2026-04-22 19:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:26:49.198700728 +0000 UTC m=+158.983690618" watchObservedRunningTime="2026-04-22 19:26:49.199345723 +0000 UTC m=+158.984335594" Apr 22 19:26:49.216633 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.216583 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-tzsmr" podStartSLOduration=20.897621849 podStartE2EDuration="23.216566436s" podCreationTimestamp="2026-04-22 19:26:26 +0000 UTC" firstStartedPulling="2026-04-22 19:26:27.002306833 +0000 UTC m=+136.787296684" lastFinishedPulling="2026-04-22 19:26:29.321251418 +0000 UTC m=+139.106241271" observedRunningTime="2026-04-22 19:26:49.215536229 +0000 UTC m=+159.000526101" watchObservedRunningTime="2026-04-22 19:26:49.216566436 +0000 UTC m=+159.001556310" Apr 22 19:26:49.687656 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.687620 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:49.690556 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:49.690523 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:50.183325 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:50.183294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" 
event={"ID":"d6b05d52-0d1e-4259-a36b-3b1d0e753715","Type":"ContainerStarted","Data":"8fab23d1adbee21a575d8f9ce1269d9aa839fe6f9d027170b2b83f5830036681"} Apr 22 19:26:50.183670 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:50.183622 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:50.184818 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:50.184798 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79d8474b76-rbnlg" Apr 22 19:26:50.204574 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:50.204489 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-glpd4" podStartSLOduration=32.89786166 podStartE2EDuration="34.204471385s" podCreationTimestamp="2026-04-22 19:26:16 +0000 UTC" firstStartedPulling="2026-04-22 19:26:48.81075835 +0000 UTC m=+158.595748200" lastFinishedPulling="2026-04-22 19:26:50.117368071 +0000 UTC m=+159.902357925" observedRunningTime="2026-04-22 19:26:50.202471234 +0000 UTC m=+159.987461109" watchObservedRunningTime="2026-04-22 19:26:50.204471385 +0000 UTC m=+159.989461258" Apr 22 19:26:51.001730 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.001697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:26:51.001885 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.001739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " 
pod="openshift-dns/dns-default-hhwkd" Apr 22 19:26:51.004099 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.004067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8046910-c087-4b0d-a917-3216261f41d0-metrics-tls\") pod \"dns-default-hhwkd\" (UID: \"f8046910-c087-4b0d-a917-3216261f41d0\") " pod="openshift-dns/dns-default-hhwkd" Apr 22 19:26:51.004224 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.004106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15547e0d-8f10-470f-a80b-0cb53add2696-cert\") pod \"ingress-canary-9tp5r\" (UID: \"15547e0d-8f10-470f-a80b-0cb53add2696\") " pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:26:51.271429 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.271400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jd9zz\"" Apr 22 19:26:51.271827 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.271412 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76gtl\"" Apr 22 19:26:51.279933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.279910 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9tp5r" Apr 22 19:26:51.279933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.279930 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hhwkd" Apr 22 19:26:51.425257 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.425228 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hhwkd"] Apr 22 19:26:51.428146 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:51.428113 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8046910_c087_4b0d_a917_3216261f41d0.slice/crio-c91c61dae1f7ff1ef58b31db045c89f49ee4d556e7ee32ee456a0819e49587cc WatchSource:0}: Error finding container c91c61dae1f7ff1ef58b31db045c89f49ee4d556e7ee32ee456a0819e49587cc: Status 404 returned error can't find the container with id c91c61dae1f7ff1ef58b31db045c89f49ee4d556e7ee32ee456a0819e49587cc Apr 22 19:26:51.440214 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:51.440194 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9tp5r"] Apr 22 19:26:51.443603 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:51.443578 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15547e0d_8f10_470f_a80b_0cb53add2696.slice/crio-659f5009b481af8a56b1a2083e8262ac37435b64b2ad1b9a541d0d70f2473a20 WatchSource:0}: Error finding container 659f5009b481af8a56b1a2083e8262ac37435b64b2ad1b9a541d0d70f2473a20: Status 404 returned error can't find the container with id 659f5009b481af8a56b1a2083e8262ac37435b64b2ad1b9a541d0d70f2473a20 Apr 22 19:26:52.191358 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:52.191301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9tp5r" event={"ID":"15547e0d-8f10-470f-a80b-0cb53add2696","Type":"ContainerStarted","Data":"659f5009b481af8a56b1a2083e8262ac37435b64b2ad1b9a541d0d70f2473a20"} Apr 22 19:26:52.192647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:52.192614 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/dns-default-hhwkd" event={"ID":"f8046910-c087-4b0d-a917-3216261f41d0","Type":"ContainerStarted","Data":"c91c61dae1f7ff1ef58b31db045c89f49ee4d556e7ee32ee456a0819e49587cc"} Apr 22 19:26:53.294036 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.293959 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8gg6h"] Apr 22 19:26:53.305646 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.305624 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.308475 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.308452 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8gg6h"] Apr 22 19:26:53.309789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.309766 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pr677\"" Apr 22 19:26:53.309925 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.309906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:26:53.309977 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.309961 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:26:53.403434 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.403411 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-dtw9f"] Apr 22 19:26:53.405773 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.405751 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-554b658566-jfndr"] Apr 22 19:26:53.406006 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.405975 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-dtw9f" Apr 22 19:26:53.407583 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.407564 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.408582 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.408560 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-dvtdz\"" Apr 22 19:26:53.408685 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.408594 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:26:53.408863 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.408845 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:26:53.410701 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.410680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lccrt\"" Apr 22 19:26:53.410800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.410753 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:26:53.411096 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.410945 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:26:53.411225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.411206 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:26:53.416740 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.416717 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:26:53.417920 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.417901 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-dtw9f"] Apr 22 19:26:53.425984 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.425959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/52b31b6c-6072-4c90-8348-9510ed167d08-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.426097 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.426014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/52b31b6c-6072-4c90-8348-9510ed167d08-data-volume\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.426097 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.426078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7qmz\" (UniqueName: \"kubernetes.io/projected/52b31b6c-6072-4c90-8348-9510ed167d08-kube-api-access-s7qmz\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.426210 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.426116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/52b31b6c-6072-4c90-8348-9510ed167d08-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8gg6h\" (UID: 
\"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.426210 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.426156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/52b31b6c-6072-4c90-8348-9510ed167d08-crio-socket\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.431349 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.431326 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-554b658566-jfndr"] Apr 22 19:26:53.527649 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.526886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-image-registry-private-configuration\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.527803 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.527658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/52b31b6c-6072-4c90-8348-9510ed167d08-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.527803 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.527720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-trusted-ca\") pod \"image-registry-554b658566-jfndr\" (UID: 
\"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.528485 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgmc5\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-kube-api-access-xgmc5\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.528485 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/52b31b6c-6072-4c90-8348-9510ed167d08-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.528647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-installation-pull-secrets\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.528647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/52b31b6c-6072-4c90-8348-9510ed167d08-data-volume\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.528647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528617 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-bound-sa-token\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.528794 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528645 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7qmz\" (UniqueName: \"kubernetes.io/projected/52b31b6c-6072-4c90-8348-9510ed167d08-kube-api-access-s7qmz\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.528794 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-ca-trust-extracted\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.528794 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zns5f\" (UniqueName: \"kubernetes.io/projected/a58a96e8-098b-416a-9c25-554a1abb6b1d-kube-api-access-zns5f\") pod \"downloads-6bcc868b7-dtw9f\" (UID: \"a58a96e8-098b-416a-9c25-554a1abb6b1d\") " pod="openshift-console/downloads-6bcc868b7-dtw9f" Apr 22 19:26:53.528794 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/52b31b6c-6072-4c90-8348-9510ed167d08-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.528794 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-registry-tls\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.529037 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/52b31b6c-6072-4c90-8348-9510ed167d08-crio-socket\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.529037 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-registry-certificates\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:26:53.529037 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/52b31b6c-6072-4c90-8348-9510ed167d08-data-volume\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.529037 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.528963 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/52b31b6c-6072-4c90-8348-9510ed167d08-crio-socket\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.531956 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.531899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/52b31b6c-6072-4c90-8348-9510ed167d08-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.541041 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.541016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7qmz\" (UniqueName: \"kubernetes.io/projected/52b31b6c-6072-4c90-8348-9510ed167d08-kube-api-access-s7qmz\") pod \"insights-runtime-extractor-8gg6h\" (UID: \"52b31b6c-6072-4c90-8348-9510ed167d08\") " pod="openshift-insights/insights-runtime-extractor-8gg6h" Apr 22 19:26:53.616719 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.616554 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8gg6h"
Apr 22 19:26:53.630135 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-ca-trust-extracted\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.630135 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zns5f\" (UniqueName: \"kubernetes.io/projected/a58a96e8-098b-416a-9c25-554a1abb6b1d-kube-api-access-zns5f\") pod \"downloads-6bcc868b7-dtw9f\" (UID: \"a58a96e8-098b-416a-9c25-554a1abb6b1d\") " pod="openshift-console/downloads-6bcc868b7-dtw9f"
Apr 22 19:26:53.630416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-registry-tls\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.630416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-registry-certificates\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.630416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-image-registry-private-configuration\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.630416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-trusted-ca\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.630416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgmc5\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-kube-api-access-xgmc5\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.630416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-installation-pull-secrets\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.630416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-bound-sa-token\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.630979 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.630528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-ca-trust-extracted\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.632420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.632363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-registry-certificates\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.632544 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.632437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-trusted-ca\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.633372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.633337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-image-registry-private-configuration\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.633482 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.633455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-registry-tls\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.633829 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.633808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-installation-pull-secrets\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.637714 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.637696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgmc5\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-kube-api-access-xgmc5\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.638315 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.638297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zns5f\" (UniqueName: \"kubernetes.io/projected/a58a96e8-098b-416a-9c25-554a1abb6b1d-kube-api-access-zns5f\") pod \"downloads-6bcc868b7-dtw9f\" (UID: \"a58a96e8-098b-416a-9c25-554a1abb6b1d\") " pod="openshift-console/downloads-6bcc868b7-dtw9f"
Apr 22 19:26:53.641223 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.641183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e-bound-sa-token\") pod \"image-registry-554b658566-jfndr\" (UID: \"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e\") " pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.731980 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.731943 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8gg6h"]
Apr 22 19:26:53.734807 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.734779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-dtw9f"
Apr 22 19:26:53.736105 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:53.736080 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b31b6c_6072_4c90_8348_9510ed167d08.slice/crio-e75dd9e97a8b676eb0952e2525585246e584dfce7e899edcabbe9cf1df7bfb0a WatchSource:0}: Error finding container e75dd9e97a8b676eb0952e2525585246e584dfce7e899edcabbe9cf1df7bfb0a: Status 404 returned error can't find the container with id e75dd9e97a8b676eb0952e2525585246e584dfce7e899edcabbe9cf1df7bfb0a
Apr 22 19:26:53.739449 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.739424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:53.871938 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.871906 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-dtw9f"]
Apr 22 19:26:53.875075 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:53.875047 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda58a96e8_098b_416a_9c25_554a1abb6b1d.slice/crio-392fd16b2f1ec56facf683ec35178a9ea5d275e16de12367e9371f22e90bd526 WatchSource:0}: Error finding container 392fd16b2f1ec56facf683ec35178a9ea5d275e16de12367e9371f22e90bd526: Status 404 returned error can't find the container with id 392fd16b2f1ec56facf683ec35178a9ea5d275e16de12367e9371f22e90bd526
Apr 22 19:26:53.888276 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:53.888252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-554b658566-jfndr"]
Apr 22 19:26:53.890963 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:26:53.890940 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c00bd78_14f8_4c6f_b6e5_e536a5b57c3e.slice/crio-f375d77b90149a75dee0f1cc6f6ac075c2b5dcdcbb578f45fa3ece644774d6c6 WatchSource:0}: Error finding container f375d77b90149a75dee0f1cc6f6ac075c2b5dcdcbb578f45fa3ece644774d6c6: Status 404 returned error can't find the container with id f375d77b90149a75dee0f1cc6f6ac075c2b5dcdcbb578f45fa3ece644774d6c6
Apr 22 19:26:54.201453 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.201360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hhwkd" event={"ID":"f8046910-c087-4b0d-a917-3216261f41d0","Type":"ContainerStarted","Data":"a6174e2b045c470fd6267f73dcd1a3c40403097ad0c337062935e5b072e98906"}
Apr 22 19:26:54.201453 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.201403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hhwkd" event={"ID":"f8046910-c087-4b0d-a917-3216261f41d0","Type":"ContainerStarted","Data":"21cfc799ce27a932cc162f7a4f1faef38c0e52ab98984a1d9f2d8b348b3ea969"}
Apr 22 19:26:54.201696 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.201523 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hhwkd"
Apr 22 19:26:54.202753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.202715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gg6h" event={"ID":"52b31b6c-6072-4c90-8348-9510ed167d08","Type":"ContainerStarted","Data":"fd4307e1bac7a46c5df3f87a7014ca236c82637d30a270be677996ac365b451f"}
Apr 22 19:26:54.202753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.202743 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gg6h" event={"ID":"52b31b6c-6072-4c90-8348-9510ed167d08","Type":"ContainerStarted","Data":"e75dd9e97a8b676eb0952e2525585246e584dfce7e899edcabbe9cf1df7bfb0a"}
Apr 22 19:26:54.203851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.203816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554b658566-jfndr" event={"ID":"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e","Type":"ContainerStarted","Data":"7e3191abb0df451e045742c6f48f1de2773057fe5ce00e4aeb743dc7b47206e2"}
Apr 22 19:26:54.203851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.203845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554b658566-jfndr" event={"ID":"2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e","Type":"ContainerStarted","Data":"f375d77b90149a75dee0f1cc6f6ac075c2b5dcdcbb578f45fa3ece644774d6c6"}
Apr 22 19:26:54.204005 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.203964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-554b658566-jfndr"
Apr 22 19:26:54.204887 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.204868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-dtw9f" event={"ID":"a58a96e8-098b-416a-9c25-554a1abb6b1d","Type":"ContainerStarted","Data":"392fd16b2f1ec56facf683ec35178a9ea5d275e16de12367e9371f22e90bd526"}
Apr 22 19:26:54.206059 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.206041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9tp5r" event={"ID":"15547e0d-8f10-470f-a80b-0cb53add2696","Type":"ContainerStarted","Data":"0c1d79d222bd92e7179d8e4f11c93fd63e56a4ef8a947cfa696affdc6904ec14"}
Apr 22 19:26:54.218002 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.217956 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hhwkd" podStartSLOduration=129.279996749 podStartE2EDuration="2m11.2179405s" podCreationTimestamp="2026-04-22 19:24:43 +0000 UTC" firstStartedPulling="2026-04-22 19:26:51.430156403 +0000 UTC m=+161.215146253" lastFinishedPulling="2026-04-22 19:26:53.368100145 +0000 UTC m=+163.153090004" observedRunningTime="2026-04-22 19:26:54.217428173 +0000 UTC m=+164.002418045" watchObservedRunningTime="2026-04-22 19:26:54.2179405 +0000 UTC m=+164.002930370"
Apr 22 19:26:54.235186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.235145 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-554b658566-jfndr" podStartSLOduration=1.235134883 podStartE2EDuration="1.235134883s" podCreationTimestamp="2026-04-22 19:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:26:54.234528182 +0000 UTC m=+164.019518056" watchObservedRunningTime="2026-04-22 19:26:54.235134883 +0000 UTC m=+164.020124751"
Apr 22 19:26:54.249443 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:54.249400 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9tp5r" podStartSLOduration=129.323375712 podStartE2EDuration="2m11.249384923s" podCreationTimestamp="2026-04-22 19:24:43 +0000 UTC" firstStartedPulling="2026-04-22 19:26:51.445198934 +0000 UTC m=+161.230188784" lastFinishedPulling="2026-04-22 19:26:53.371208132 +0000 UTC m=+163.156197995" observedRunningTime="2026-04-22 19:26:54.248965006 +0000 UTC m=+164.033954892" watchObservedRunningTime="2026-04-22 19:26:54.249384923 +0000 UTC m=+164.034374796"
Apr 22 19:26:55.211230 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:55.210840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gg6h" event={"ID":"52b31b6c-6072-4c90-8348-9510ed167d08","Type":"ContainerStarted","Data":"de81bd5e27e1fb6b16758e452ee33aab669cdd5cb6e6a52ef4aa41ab687d9cb7"}
Apr 22 19:26:56.216179 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:56.216140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gg6h" event={"ID":"52b31b6c-6072-4c90-8348-9510ed167d08","Type":"ContainerStarted","Data":"d4e3003ad6813d5e5bb4ea2805a592f07683d086c980105765a877fe5dd741f4"}
Apr 22 19:26:56.243352 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:56.243296 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8gg6h" podStartSLOduration=1.017841637 podStartE2EDuration="3.243278239s" podCreationTimestamp="2026-04-22 19:26:53 +0000 UTC" firstStartedPulling="2026-04-22 19:26:53.808971382 +0000 UTC m=+163.593961232" lastFinishedPulling="2026-04-22 19:26:56.034407981 +0000 UTC m=+165.819397834" observedRunningTime="2026-04-22 19:26:56.242939019 +0000 UTC m=+166.027928892" watchObservedRunningTime="2026-04-22 19:26:56.243278239 +0000 UTC m=+166.028268111"
Apr 22 19:26:58.750437 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:26:58.750398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85"
Apr 22 19:27:00.226522 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.226477 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f5f455d4f-vs2h4"]
Apr 22 19:27:00.251795 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.251767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f5f455d4f-vs2h4"]
Apr 22 19:27:00.251967 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.251890 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.255050 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.255028 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 19:27:00.255947 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.255920 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 19:27:00.256056 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.255929 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 19:27:00.256185 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.256164 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-slxkc\""
Apr 22 19:27:00.256270 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.256223 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 19:27:00.256405 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.256390 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 19:27:00.385900 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.385870 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-oauth-config\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.386054 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.385917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-service-ca\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.386054 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.386010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-config\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.386138 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.386082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-oauth-serving-cert\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.386138 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.386107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-serving-cert\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.386138 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.386125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9crms\" (UniqueName: \"kubernetes.io/projected/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-kube-api-access-9crms\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487000 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.486922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-oauth-serving-cert\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487000 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.486960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-serving-cert\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487000 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.486976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9crms\" (UniqueName: \"kubernetes.io/projected/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-kube-api-access-9crms\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.487020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-oauth-config\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.487065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-service-ca\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.487127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-config\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487805 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.487778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-config\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487931 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.487800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-oauth-serving-cert\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.487999 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.487966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-service-ca\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.489696 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.489646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-oauth-config\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.489925 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.489906 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-serving-cert\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.495230 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.495208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9crms\" (UniqueName: \"kubernetes.io/projected/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-kube-api-access-9crms\") pod \"console-f5f455d4f-vs2h4\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.562686 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.562641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f5f455d4f-vs2h4"
Apr 22 19:27:00.697895 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:00.697868 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f5f455d4f-vs2h4"]
Apr 22 19:27:00.700795 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:27:00.700754 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ae80e_54eb_4ea7_b6b1_cadca67fc9c3.slice/crio-3c2c45d11b96bfb356a9d0c6981e0568f6b93ede60e759a90071ad9e33275415 WatchSource:0}: Error finding container 3c2c45d11b96bfb356a9d0c6981e0568f6b93ede60e759a90071ad9e33275415: Status 404 returned error can't find the container with id 3c2c45d11b96bfb356a9d0c6981e0568f6b93ede60e759a90071ad9e33275415
Apr 22 19:27:01.232428 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:01.232379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f5f455d4f-vs2h4" event={"ID":"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3","Type":"ContainerStarted","Data":"3c2c45d11b96bfb356a9d0c6981e0568f6b93ede60e759a90071ad9e33275415"}
Apr 22 19:27:02.109012 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.108765 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-4d486"]
Apr 22 19:27:02.111833 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.111812 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.117144 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.116965 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-b8znt\""
Apr 22 19:27:02.121801 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.121220 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 19:27:02.129617 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.127038 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 19:27:02.129617 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.127249 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 19:27:02.132317 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.129860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 19:27:02.222426 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-root\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.222637 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w42r4\" (UniqueName: \"kubernetes.io/projected/b9bf8578-aad6-4b99-a551-6f6e222655c1-kube-api-access-w42r4\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.222637 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-sys\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.222637 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.222637 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.222848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-wtmp\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.222848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9bf8578-aad6-4b99-a551-6f6e222655c1-metrics-client-ca\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.222848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-tls\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.222848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.222722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-textfile\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.323971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.324326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-wtmp\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.324382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9bf8578-aad6-4b99-a551-6f6e222655c1-metrics-client-ca\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.324416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-tls\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.324461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-textfile\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.324496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-root\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.324575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w42r4\" (UniqueName: \"kubernetes.io/projected/b9bf8578-aad6-4b99-a551-6f6e222655c1-kube-api-access-w42r4\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.324636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-sys\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.324700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.325571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.326365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-textfile\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.326571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-wtmp\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:27:02.326678 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:27:02.326736 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-tls podName:b9bf8578-aad6-4b99-a551-6f6e222655c1 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:02.826717459 +0000 UTC m=+172.611707309 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-tls") pod "node-exporter-4d486" (UID: "b9bf8578-aad6-4b99-a551-6f6e222655c1") : secret "node-exporter-tls" not found
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.327096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9bf8578-aad6-4b99-a551-6f6e222655c1-metrics-client-ca\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.327166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-root\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486"
Apr 22 19:27:02.330420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.327210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9bf8578-aad6-4b99-a551-6f6e222655c1-sys\") pod
\"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486" Apr 22 19:27:02.335202 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.335151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486" Apr 22 19:27:02.352935 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.349994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w42r4\" (UniqueName: \"kubernetes.io/projected/b9bf8578-aad6-4b99-a551-6f6e222655c1-kube-api-access-w42r4\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486" Apr 22 19:27:02.830515 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.830462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-tls\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486" Apr 22 19:27:02.833486 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:02.833458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9bf8578-aad6-4b99-a551-6f6e222655c1-node-exporter-tls\") pod \"node-exporter-4d486\" (UID: \"b9bf8578-aad6-4b99-a551-6f6e222655c1\") " pod="openshift-monitoring/node-exporter-4d486" Apr 22 19:27:03.033530 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.033473 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-4d486" Apr 22 19:27:03.204336 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.204249 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:27:03.207409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.207380 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.210789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.209895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 19:27:03.210789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.210133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 19:27:03.210789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.210259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 19:27:03.210789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.210356 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 19:27:03.210789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.210385 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 19:27:03.210789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.210398 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 19:27:03.210789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.210449 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 19:27:03.210789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.210772 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-pj4lg\"" Apr 22 19:27:03.211449 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.211430 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 19:27:03.211682 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.211662 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 19:27:03.226392 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.226356 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:27:03.336481 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-config-volume\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-299gv\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-kube-api-access-299gv\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-config-out\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.336954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.337550 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.336981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.337550 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.337021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-web-config\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.437480 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.437443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.437678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.437524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.437678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.437559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.437678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.437600 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-web-config\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.437678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.437643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-config-volume\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.437678 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.437667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-299gv\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-kube-api-access-299gv\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.437957 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.437699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-config-out\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.438410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.438472 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.438526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.438552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.438581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.438631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.438681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:27:03.438805 2576 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:27:03.438865 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls podName:cbc20430-c16b-4431-b863-00fc63370fb0 nodeName:}" failed. No retries permitted until 2026-04-22 19:27:03.938847161 +0000 UTC m=+173.723837014 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0") : secret "alertmanager-main-tls" not found Apr 22 19:27:03.439338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.438925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.440637 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.440588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.441595 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.441560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-config-volume\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.441770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.441736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.442272 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:27:03.442228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-config-out\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.442364 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.442335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.442417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.442383 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-web-config\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.443514 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.443463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.443922 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.443899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 22 19:27:03.444010 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.443922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.449604 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.449578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-299gv\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-kube-api-access-299gv\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.943519 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.943473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:03.946360 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:03.946335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:04.120641 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:04.120605 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:27:04.214430 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:04.214360 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hhwkd" Apr 22 19:27:06.408430 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.408384 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-5f4fd5879d-4728c"] Apr 22 19:27:06.411448 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.411424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.414201 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.414178 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 19:27:06.415374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.415334 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-3vsq9ol5a0bcr\"" Apr 22 19:27:06.415374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.415354 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 19:27:06.415575 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.415387 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 19:27:06.415575 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.415398 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-dtkqf\"" Apr 22 19:27:06.415575 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.415334 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 
19:27:06.421816 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.421788 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5f4fd5879d-4728c"] Apr 22 19:27:06.568329 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.568297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwncb\" (UniqueName: \"kubernetes.io/projected/5f026615-7d00-4280-9da6-68e7d4f3e23d-kube-api-access-wwncb\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.568524 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.568362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-client-ca-bundle\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.568524 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.568384 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f026615-7d00-4280-9da6-68e7d4f3e23d-audit-log\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.568524 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.568425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-secret-metrics-server-tls\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 
19:27:06.568694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.568494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-secret-metrics-server-client-certs\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.568694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.568674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f026615-7d00-4280-9da6-68e7d4f3e23d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.568798 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.568724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f026615-7d00-4280-9da6-68e7d4f3e23d-metrics-server-audit-profiles\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.670065 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.669988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwncb\" (UniqueName: \"kubernetes.io/projected/5f026615-7d00-4280-9da6-68e7d4f3e23d-kube-api-access-wwncb\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.670065 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.670054 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-client-ca-bundle\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.670283 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.670085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f026615-7d00-4280-9da6-68e7d4f3e23d-audit-log\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.670283 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.670118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-secret-metrics-server-tls\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.670283 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.670182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-secret-metrics-server-client-certs\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.670283 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.670216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f026615-7d00-4280-9da6-68e7d4f3e23d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: 
\"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.670283 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.670248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f026615-7d00-4280-9da6-68e7d4f3e23d-metrics-server-audit-profiles\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.670625 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.670601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/5f026615-7d00-4280-9da6-68e7d4f3e23d-audit-log\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.671101 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.671076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f026615-7d00-4280-9da6-68e7d4f3e23d-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.671376 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.671352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/5f026615-7d00-4280-9da6-68e7d4f3e23d-metrics-server-audit-profiles\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.672999 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.672973 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-client-ca-bundle\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.673639 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.673615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-secret-metrics-server-client-certs\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.673826 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.673806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/5f026615-7d00-4280-9da6-68e7d4f3e23d-secret-metrics-server-tls\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.678216 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.678196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwncb\" (UniqueName: \"kubernetes.io/projected/5f026615-7d00-4280-9da6-68e7d4f3e23d-kube-api-access-wwncb\") pod \"metrics-server-5f4fd5879d-4728c\" (UID: \"5f026615-7d00-4280-9da6-68e7d4f3e23d\") " pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:06.723909 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:06.723875 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:10.609718 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.609684 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76c8d88db7-552xk"] Apr 22 19:27:10.612883 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.612861 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.621679 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.621451 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:27:10.623661 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.623634 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c8d88db7-552xk"] Apr 22 19:27:10.710201 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.710170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-service-ca\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.710418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.710263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-oauth-config\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.710418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.710327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-serving-cert\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.710418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.710371 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-trusted-ca-bundle\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.710418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.710399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-oauth-serving-cert\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.710684 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.710430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-console-config\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.710684 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.710480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbfzw\" (UniqueName: \"kubernetes.io/projected/c7637f0b-c587-4c39-b962-34f0fe988685-kube-api-access-mbfzw\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.811752 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:27:10.811713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-serving-cert\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.811937 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.811788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-trusted-ca-bundle\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.811937 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.811817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-oauth-serving-cert\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.811937 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.811848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-console-config\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.811937 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.811875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbfzw\" (UniqueName: \"kubernetes.io/projected/c7637f0b-c587-4c39-b962-34f0fe988685-kube-api-access-mbfzw\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " 
pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.812152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.811987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-service-ca\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.812152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.812092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-oauth-config\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.812961 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.812892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-console-config\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.812961 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.812898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-oauth-serving-cert\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.813135 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.813019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-service-ca\") pod \"console-76c8d88db7-552xk\" (UID: 
\"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.813666 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.813641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-trusted-ca-bundle\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.814588 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.814561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-serving-cert\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.814812 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.814793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-oauth-config\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.819940 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.819916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbfzw\" (UniqueName: \"kubernetes.io/projected/c7637f0b-c587-4c39-b962-34f0fe988685-kube-api-access-mbfzw\") pod \"console-76c8d88db7-552xk\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") " pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.924764 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:10.924683 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:10.950155 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:27:10.950110 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bf8578_aad6_4b99_a551_6f6e222655c1.slice/crio-60dd701ca9a9517456559ed18c01f9abdc03f98c4ce44c77b7015c27a1b7f076 WatchSource:0}: Error finding container 60dd701ca9a9517456559ed18c01f9abdc03f98c4ce44c77b7015c27a1b7f076: Status 404 returned error can't find the container with id 60dd701ca9a9517456559ed18c01f9abdc03f98c4ce44c77b7015c27a1b7f076 Apr 22 19:27:11.132669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.132627 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:27:11.134738 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:27:11.134707 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc20430_c16b_4431_b863_00fc63370fb0.slice/crio-ef9e732b9c5f7f509f84f094d61148df186060afe7dc22266d993c3e5a2ff0b4 WatchSource:0}: Error finding container ef9e732b9c5f7f509f84f094d61148df186060afe7dc22266d993c3e5a2ff0b4: Status 404 returned error can't find the container with id ef9e732b9c5f7f509f84f094d61148df186060afe7dc22266d993c3e5a2ff0b4 Apr 22 19:27:11.265831 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.265790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-dtw9f" event={"ID":"a58a96e8-098b-416a-9c25-554a1abb6b1d","Type":"ContainerStarted","Data":"b66c2c9122fbacd0d605c8d69d83dabb035273dd29bc06ee8beac5a1c312cd63"} Apr 22 19:27:11.266530 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.266355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-dtw9f" Apr 22 19:27:11.267295 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:27:11.267269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f5f455d4f-vs2h4" event={"ID":"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3","Type":"ContainerStarted","Data":"5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457"} Apr 22 19:27:11.268558 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.268526 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerStarted","Data":"ef9e732b9c5f7f509f84f094d61148df186060afe7dc22266d993c3e5a2ff0b4"} Apr 22 19:27:11.269627 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.269600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4d486" event={"ID":"b9bf8578-aad6-4b99-a551-6f6e222655c1","Type":"ContainerStarted","Data":"60dd701ca9a9517456559ed18c01f9abdc03f98c4ce44c77b7015c27a1b7f076"} Apr 22 19:27:11.278022 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.277997 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-dtw9f" Apr 22 19:27:11.284983 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.284936 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-dtw9f" podStartSLOduration=1.152191127 podStartE2EDuration="18.284924456s" podCreationTimestamp="2026-04-22 19:26:53 +0000 UTC" firstStartedPulling="2026-04-22 19:26:53.87692216 +0000 UTC m=+163.661912011" lastFinishedPulling="2026-04-22 19:27:11.009655488 +0000 UTC m=+180.794645340" observedRunningTime="2026-04-22 19:27:11.283332197 +0000 UTC m=+181.068322070" watchObservedRunningTime="2026-04-22 19:27:11.284924456 +0000 UTC m=+181.069914328" Apr 22 19:27:11.320208 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.320155 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f5f455d4f-vs2h4" 
podStartSLOduration=1.069729926 podStartE2EDuration="11.320136798s" podCreationTimestamp="2026-04-22 19:27:00 +0000 UTC" firstStartedPulling="2026-04-22 19:27:00.703855942 +0000 UTC m=+170.488845792" lastFinishedPulling="2026-04-22 19:27:10.954262802 +0000 UTC m=+180.739252664" observedRunningTime="2026-04-22 19:27:11.318094036 +0000 UTC m=+181.103083909" watchObservedRunningTime="2026-04-22 19:27:11.320136798 +0000 UTC m=+181.105126686" Apr 22 19:27:11.351423 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.351382 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c8d88db7-552xk"] Apr 22 19:27:11.354131 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:11.354107 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-5f4fd5879d-4728c"] Apr 22 19:27:11.358263 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:27:11.358229 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7637f0b_c587_4c39_b962_34f0fe988685.slice/crio-eb34c9213292fdcbb9ced37714497ffa7412bc34944b9c3260405393f9918412 WatchSource:0}: Error finding container eb34c9213292fdcbb9ced37714497ffa7412bc34944b9c3260405393f9918412: Status 404 returned error can't find the container with id eb34c9213292fdcbb9ced37714497ffa7412bc34944b9c3260405393f9918412 Apr 22 19:27:11.359367 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:27:11.359331 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f026615_7d00_4280_9da6_68e7d4f3e23d.slice/crio-4d73837b8f135195af184d814602b9c0e00f6930e8110d1a08bf364c3295f556 WatchSource:0}: Error finding container 4d73837b8f135195af184d814602b9c0e00f6930e8110d1a08bf364c3295f556: Status 404 returned error can't find the container with id 4d73837b8f135195af184d814602b9c0e00f6930e8110d1a08bf364c3295f556 Apr 22 19:27:12.274975 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:27:12.274935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" event={"ID":"5f026615-7d00-4280-9da6-68e7d4f3e23d","Type":"ContainerStarted","Data":"4d73837b8f135195af184d814602b9c0e00f6930e8110d1a08bf364c3295f556"} Apr 22 19:27:12.277313 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:12.277230 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9bf8578-aad6-4b99-a551-6f6e222655c1" containerID="0bc8a9e4263d0a48f2a67a512db32a6e37ca5a0cdeb0e1b66625de5575ca3673" exitCode=0 Apr 22 19:27:12.277313 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:12.277311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4d486" event={"ID":"b9bf8578-aad6-4b99-a551-6f6e222655c1","Type":"ContainerDied","Data":"0bc8a9e4263d0a48f2a67a512db32a6e37ca5a0cdeb0e1b66625de5575ca3673"} Apr 22 19:27:12.281495 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:12.281438 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c8d88db7-552xk" event={"ID":"c7637f0b-c587-4c39-b962-34f0fe988685","Type":"ContainerStarted","Data":"733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6"} Apr 22 19:27:12.281495 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:12.281478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c8d88db7-552xk" event={"ID":"c7637f0b-c587-4c39-b962-34f0fe988685","Type":"ContainerStarted","Data":"eb34c9213292fdcbb9ced37714497ffa7412bc34944b9c3260405393f9918412"} Apr 22 19:27:12.318989 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:12.318387 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76c8d88db7-552xk" podStartSLOduration=2.318366944 podStartE2EDuration="2.318366944s" podCreationTimestamp="2026-04-22 19:27:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:12.317100215 +0000 UTC m=+182.102090101" watchObservedRunningTime="2026-04-22 19:27:12.318366944 +0000 UTC m=+182.103356846" Apr 22 19:27:13.285457 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:13.285417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4d486" event={"ID":"b9bf8578-aad6-4b99-a551-6f6e222655c1","Type":"ContainerStarted","Data":"5b698e4616079cd057f6dbbb02544521dec5ec4ab3b7a08e3c774272dececee0"} Apr 22 19:27:14.290672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:14.290628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-4d486" event={"ID":"b9bf8578-aad6-4b99-a551-6f6e222655c1","Type":"ContainerStarted","Data":"4a7cea09a3897296ecc2722e0c7f5eca4a149d37d6fcff326e4b1507c66c4712"} Apr 22 19:27:14.292165 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:14.292139 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbc20430-c16b-4431-b863-00fc63370fb0" containerID="f9faa27f078dcb8ad93a769bf4f905bf652057f3df039fda9f43f86da55d00f7" exitCode=0 Apr 22 19:27:14.292290 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:14.292208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerDied","Data":"f9faa27f078dcb8ad93a769bf4f905bf652057f3df039fda9f43f86da55d00f7"} Apr 22 19:27:14.294000 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:14.293975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" event={"ID":"5f026615-7d00-4280-9da6-68e7d4f3e23d","Type":"ContainerStarted","Data":"60fd4f5b3a6892931db0faa6b0fc1c3a1f791fae52e1a82d32a245f23b5f1d98"} Apr 22 19:27:14.316072 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:14.316026 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-4d486" podStartSLOduration=11.597389645 podStartE2EDuration="12.316010475s" podCreationTimestamp="2026-04-22 19:27:02 +0000 UTC" firstStartedPulling="2026-04-22 19:27:10.952307898 +0000 UTC m=+180.737297749" lastFinishedPulling="2026-04-22 19:27:11.670928722 +0000 UTC m=+181.455918579" observedRunningTime="2026-04-22 19:27:14.314522216 +0000 UTC m=+184.099512092" watchObservedRunningTime="2026-04-22 19:27:14.316010475 +0000 UTC m=+184.101000348" Apr 22 19:27:14.366708 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:14.366656 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" podStartSLOduration=6.275296548 podStartE2EDuration="8.366638704s" podCreationTimestamp="2026-04-22 19:27:06 +0000 UTC" firstStartedPulling="2026-04-22 19:27:11.361986229 +0000 UTC m=+181.146976094" lastFinishedPulling="2026-04-22 19:27:13.453328387 +0000 UTC m=+183.238318250" observedRunningTime="2026-04-22 19:27:14.36592641 +0000 UTC m=+184.150916282" watchObservedRunningTime="2026-04-22 19:27:14.366638704 +0000 UTC m=+184.151628578" Apr 22 19:27:15.217102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:15.217062 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-554b658566-jfndr" Apr 22 19:27:17.308171 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:17.308132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerStarted","Data":"70c44d14f7658093176dc1b552dded4d0ef4ae92f3599aa2a1f59039c546e351"} Apr 22 19:27:17.308725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:17.308180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerStarted","Data":"d41f41aa65bc2518836ed6bb60a93591906db7d1e49cbd1fe436b2d8ffdf437e"} Apr 22 19:27:17.308725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:17.308196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerStarted","Data":"1a73a52f9488a1a25bc92a89937146fea0f90f0971137fc37e9ce5528a46c07a"} Apr 22 19:27:17.308725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:17.308208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerStarted","Data":"1eb387f1cf203e38384ae0fcd0503c3869804f50c0037a396d86327ea02de862"} Apr 22 19:27:17.308725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:17.308216 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerStarted","Data":"040f0de3f6f4da71f4a666958520c5e49c2640a10b7f0523a4ae0a80f3b57999"} Apr 22 19:27:19.319177 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:19.319141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerStarted","Data":"beb333d80105113a2c420b79f71f7f81d9de055f337e43c03baa7a0ec6408d6a"} Apr 22 19:27:19.350338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:19.350284 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=9.227695299 podStartE2EDuration="16.350265492s" podCreationTimestamp="2026-04-22 19:27:03 +0000 UTC" firstStartedPulling="2026-04-22 19:27:11.137530934 +0000 UTC m=+180.922520784" lastFinishedPulling="2026-04-22 19:27:18.260101124 +0000 UTC m=+188.045090977" 
observedRunningTime="2026-04-22 19:27:19.348874582 +0000 UTC m=+189.133864490" watchObservedRunningTime="2026-04-22 19:27:19.350265492 +0000 UTC m=+189.135255365" Apr 22 19:27:20.563643 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:20.563604 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f5f455d4f-vs2h4" Apr 22 19:27:20.563643 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:20.563652 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f5f455d4f-vs2h4" Apr 22 19:27:20.569547 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:20.569519 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f5f455d4f-vs2h4" Apr 22 19:27:20.925579 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:20.925459 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:20.925579 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:20.925511 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:20.930192 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:20.930165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:21.330360 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:21.330332 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76c8d88db7-552xk" Apr 22 19:27:21.330538 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:21.330384 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f5f455d4f-vs2h4" Apr 22 19:27:21.393366 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:21.393334 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f5f455d4f-vs2h4"] 
Apr 22 19:27:26.724159 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:26.724118 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:26.724755 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:26.724174 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:35.369109 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:35.369068 2576 generic.go:358] "Generic (PLEG): container finished" podID="1a23807b-8ae8-4f57-aaa4-f01cc3d1f680" containerID="e41539ae9b42c7830c9a88a8c4724ffea90b84096e0883a85be6b08cf6311bd9" exitCode=0 Apr 22 19:27:35.369555 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:35.369115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b7xnl" event={"ID":"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680","Type":"ContainerDied","Data":"e41539ae9b42c7830c9a88a8c4724ffea90b84096e0883a85be6b08cf6311bd9"} Apr 22 19:27:35.369555 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:35.369456 2576 scope.go:117] "RemoveContainer" containerID="e41539ae9b42c7830c9a88a8c4724ffea90b84096e0883a85be6b08cf6311bd9" Apr 22 19:27:36.191514 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:36.191478 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hhwkd_f8046910-c087-4b0d-a917-3216261f41d0/dns/0.log" Apr 22 19:27:36.373576 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:36.373543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-b7xnl" event={"ID":"1a23807b-8ae8-4f57-aaa4-f01cc3d1f680","Type":"ContainerStarted","Data":"40747e309e5dd7145d35a043ca1b757e2741b1826a8b0887956d22097175668a"} Apr 22 19:27:36.394647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:36.394623 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-hhwkd_f8046910-c087-4b0d-a917-3216261f41d0/kube-rbac-proxy/0.log" Apr 22 19:27:37.591078 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:37.591051 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-frj6d_9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6/dns-node-resolver/0.log" Apr 22 19:27:37.992778 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:37.992701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79d8474b76-rbnlg_9a37ca65-2a95-4238-9686-942fb21e4095/router/0.log" Apr 22 19:27:38.192001 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:38.191973 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9tp5r_15547e0d-8f10-470f-a80b-0cb53add2696/serve-healthcheck-canary/0.log" Apr 22 19:27:45.399054 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:45.399021 2576 generic.go:358] "Generic (PLEG): container finished" podID="89e967fc-c463-4f10-9b81-910499e78afc" containerID="d30cd0dfd91b9f6d24ff4b9688815f31a7552930b485fc39d2aa71b173598e20" exitCode=0 Apr 22 19:27:45.399464 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:45.399097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs" event={"ID":"89e967fc-c463-4f10-9b81-910499e78afc","Type":"ContainerDied","Data":"d30cd0dfd91b9f6d24ff4b9688815f31a7552930b485fc39d2aa71b173598e20"} Apr 22 19:27:45.399464 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:45.399398 2576 scope.go:117] "RemoveContainer" containerID="d30cd0dfd91b9f6d24ff4b9688815f31a7552930b485fc39d2aa71b173598e20" Apr 22 19:27:46.404041 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:46.404007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-788qs" 
event={"ID":"89e967fc-c463-4f10-9b81-910499e78afc","Type":"ContainerStarted","Data":"d77ad0245fc5392df5c755a078740f6a322bb8df4bd32bd79a460951b11ba48f"} Apr 22 19:27:46.730628 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:46.730544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:46.734839 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:46.734812 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-5f4fd5879d-4728c" Apr 22 19:27:48.352023 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.351960 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f5f455d4f-vs2h4" podUID="113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" containerName="console" containerID="cri-o://5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457" gracePeriod=15 Apr 22 19:27:48.665687 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.665662 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f5f455d4f-vs2h4_113ae80e-54eb-4ea7-b6b1-cadca67fc9c3/console/0.log" Apr 22 19:27:48.665816 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.665748 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f5f455d4f-vs2h4" Apr 22 19:27:48.760061 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.760033 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-service-ca\") pod \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " Apr 22 19:27:48.760241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.760084 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-oauth-serving-cert\") pod \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " Apr 22 19:27:48.760241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.760163 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-oauth-config\") pod \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " Apr 22 19:27:48.760241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.760202 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9crms\" (UniqueName: \"kubernetes.io/projected/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-kube-api-access-9crms\") pod \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " Apr 22 19:27:48.760241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.760229 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-config\") pod \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " Apr 22 19:27:48.760716 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:27:48.760261 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-serving-cert\") pod \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\" (UID: \"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3\") " Apr 22 19:27:48.760716 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.760490 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-service-ca" (OuterVolumeSpecName: "service-ca") pod "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" (UID: "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:48.760716 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.760568 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" (UID: "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:48.760716 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.760686 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-config" (OuterVolumeSpecName: "console-config") pod "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" (UID: "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:48.762532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.762490 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" (UID: "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:48.762660 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.762629 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" (UID: "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:48.762792 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.762771 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-kube-api-access-9crms" (OuterVolumeSpecName: "kube-api-access-9crms") pod "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" (UID: "113ae80e-54eb-4ea7-b6b1-cadca67fc9c3"). InnerVolumeSpecName "kube-api-access-9crms". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:48.861902 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.861870 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-oauth-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:27:48.861902 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.861901 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-oauth-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:27:48.861902 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.861912 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9crms\" (UniqueName: \"kubernetes.io/projected/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-kube-api-access-9crms\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:27:48.862112 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.861924 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:27:48.862112 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.861933 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-console-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:27:48.862112 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:48.861943 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3-service-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:27:49.418202 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:27:49.418168 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f5f455d4f-vs2h4_113ae80e-54eb-4ea7-b6b1-cadca67fc9c3/console/0.log" Apr 22 19:27:49.418706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.418206 2576 generic.go:358] "Generic (PLEG): container finished" podID="113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" containerID="5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457" exitCode=2 Apr 22 19:27:49.418706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.418276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f5f455d4f-vs2h4" event={"ID":"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3","Type":"ContainerDied","Data":"5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457"} Apr 22 19:27:49.418706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.418284 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f5f455d4f-vs2h4" Apr 22 19:27:49.418706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.418307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f5f455d4f-vs2h4" event={"ID":"113ae80e-54eb-4ea7-b6b1-cadca67fc9c3","Type":"ContainerDied","Data":"3c2c45d11b96bfb356a9d0c6981e0568f6b93ede60e759a90071ad9e33275415"} Apr 22 19:27:49.418706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.418323 2576 scope.go:117] "RemoveContainer" containerID="5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457" Apr 22 19:27:49.426342 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.426314 2576 scope.go:117] "RemoveContainer" containerID="5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457" Apr 22 19:27:49.426646 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:27:49.426619 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457\": container with ID starting with 5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457 not found: ID does not exist" containerID="5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457" Apr 22 19:27:49.426722 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.426661 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457"} err="failed to get container status \"5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457\": rpc error: code = NotFound desc = could not find container \"5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457\": container with ID starting with 5915710003b8d95355bef5eed2605e98b62a69f0a221035e6728ef9f3e400457 not found: ID does not exist" Apr 22 19:27:49.442986 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.442944 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f5f455d4f-vs2h4"] Apr 22 19:27:49.446751 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:49.446727 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f5f455d4f-vs2h4"] Apr 22 19:27:50.754112 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:27:50.754082 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" path="/var/lib/kubelet/pods/113ae80e-54eb-4ea7-b6b1-cadca67fc9c3/volumes" Apr 22 19:28:22.276270 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.276235 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:28:22.276891 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.276852 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="alertmanager" 
containerID="cri-o://040f0de3f6f4da71f4a666958520c5e49c2640a10b7f0523a4ae0a80f3b57999" gracePeriod=120 Apr 22 19:28:22.276969 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.276906 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy-metric" containerID="cri-o://70c44d14f7658093176dc1b552dded4d0ef4ae92f3599aa2a1f59039c546e351" gracePeriod=120 Apr 22 19:28:22.276969 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.276908 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy-web" containerID="cri-o://1a73a52f9488a1a25bc92a89937146fea0f90f0971137fc37e9ce5528a46c07a" gracePeriod=120 Apr 22 19:28:22.277066 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.276954 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="prom-label-proxy" containerID="cri-o://beb333d80105113a2c420b79f71f7f81d9de055f337e43c03baa7a0ec6408d6a" gracePeriod=120 Apr 22 19:28:22.277066 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.276916 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy" containerID="cri-o://d41f41aa65bc2518836ed6bb60a93591906db7d1e49cbd1fe436b2d8ffdf437e" gracePeriod=120 Apr 22 19:28:22.277066 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.276974 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="config-reloader" 
containerID="cri-o://1eb387f1cf203e38384ae0fcd0503c3869804f50c0037a396d86327ea02de862" gracePeriod=120 Apr 22 19:28:22.510669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510639 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbc20430-c16b-4431-b863-00fc63370fb0" containerID="beb333d80105113a2c420b79f71f7f81d9de055f337e43c03baa7a0ec6408d6a" exitCode=0 Apr 22 19:28:22.510669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510663 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbc20430-c16b-4431-b863-00fc63370fb0" containerID="70c44d14f7658093176dc1b552dded4d0ef4ae92f3599aa2a1f59039c546e351" exitCode=0 Apr 22 19:28:22.510669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510669 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbc20430-c16b-4431-b863-00fc63370fb0" containerID="d41f41aa65bc2518836ed6bb60a93591906db7d1e49cbd1fe436b2d8ffdf437e" exitCode=0 Apr 22 19:28:22.510669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510675 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbc20430-c16b-4431-b863-00fc63370fb0" containerID="1eb387f1cf203e38384ae0fcd0503c3869804f50c0037a396d86327ea02de862" exitCode=0 Apr 22 19:28:22.510901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510680 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbc20430-c16b-4431-b863-00fc63370fb0" containerID="040f0de3f6f4da71f4a666958520c5e49c2640a10b7f0523a4ae0a80f3b57999" exitCode=0 Apr 22 19:28:22.510901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerDied","Data":"beb333d80105113a2c420b79f71f7f81d9de055f337e43c03baa7a0ec6408d6a"} Apr 22 19:28:22.510901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerDied","Data":"70c44d14f7658093176dc1b552dded4d0ef4ae92f3599aa2a1f59039c546e351"} Apr 22 19:28:22.510901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerDied","Data":"d41f41aa65bc2518836ed6bb60a93591906db7d1e49cbd1fe436b2d8ffdf437e"} Apr 22 19:28:22.510901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerDied","Data":"1eb387f1cf203e38384ae0fcd0503c3869804f50c0037a396d86327ea02de862"} Apr 22 19:28:22.510901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.510774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerDied","Data":"040f0de3f6f4da71f4a666958520c5e49c2640a10b7f0523a4ae0a80f3b57999"} Apr 22 19:28:22.518052 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.518028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:28:22.520286 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.520260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/342209dc-2f51-4fc2-a96f-a19424f86d57-metrics-certs\") pod \"network-metrics-daemon-rqq85\" (UID: \"342209dc-2f51-4fc2-a96f-a19424f86d57\") " pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:28:22.753758 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:28:22.753672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tlsqf\"" Apr 22 19:28:22.761794 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.761773 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rqq85" Apr 22 19:28:22.879631 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:22.879599 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rqq85"] Apr 22 19:28:22.882289 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:28:22.882263 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod342209dc_2f51_4fc2_a96f_a19424f86d57.slice/crio-54bf1d658acf74565c4921049b472f3663f9e25fd4fe6a2f54f203c37c16237d WatchSource:0}: Error finding container 54bf1d658acf74565c4921049b472f3663f9e25fd4fe6a2f54f203c37c16237d: Status 404 returned error can't find the container with id 54bf1d658acf74565c4921049b472f3663f9e25fd4fe6a2f54f203c37c16237d Apr 22 19:28:23.517985 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.517871 2576 generic.go:358] "Generic (PLEG): container finished" podID="cbc20430-c16b-4431-b863-00fc63370fb0" containerID="1a73a52f9488a1a25bc92a89937146fea0f90f0971137fc37e9ce5528a46c07a" exitCode=0 Apr 22 19:28:23.517985 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.517917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerDied","Data":"1a73a52f9488a1a25bc92a89937146fea0f90f0971137fc37e9ce5528a46c07a"} Apr 22 19:28:23.519212 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.519183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rqq85" 
event={"ID":"342209dc-2f51-4fc2-a96f-a19424f86d57","Type":"ContainerStarted","Data":"54bf1d658acf74565c4921049b472f3663f9e25fd4fe6a2f54f203c37c16237d"} Apr 22 19:28:23.540986 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.540965 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:23.629333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629263 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-config-out\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629296 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-main-db\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629318 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629334 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-cluster-tls-config\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629358 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-web-config\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629395 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-tls-assets\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629436 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-config-volume\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629464 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-trusted-ca-bundle\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629516 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-metrics-client-ca\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629550 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" 
(UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629600 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-299gv\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-kube-api-access-299gv\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629630 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-web\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.629669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629664 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"cbc20430-c16b-4431-b863-00fc63370fb0\" (UID: \"cbc20430-c16b-4431-b863-00fc63370fb0\") " Apr 22 19:28:23.630117 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629736 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:28:23.630117 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.629949 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-main-db\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.630998 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.630721 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:23.631535 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.631260 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:23.634358 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.634307 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-config-out" (OuterVolumeSpecName: "config-out") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:28:23.634468 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.634393 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:23.634772 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.634697 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:23.634772 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.634718 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:23.635141 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.635102 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:23.635397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.635366 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-kube-api-access-299gv" (OuterVolumeSpecName: "kube-api-access-299gv") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "kube-api-access-299gv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:23.636017 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.635988 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-config-volume" (OuterVolumeSpecName: "config-volume") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:23.636274 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.636250 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:23.639061 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.638658 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:23.645808 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.645782 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-web-config" (OuterVolumeSpecName: "web-config") pod "cbc20430-c16b-4431-b863-00fc63370fb0" (UID: "cbc20430-c16b-4431-b863-00fc63370fb0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:23.730913 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.730876 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-metrics-client-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.730913 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.730914 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-main-tls\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.730927 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-299gv\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-kube-api-access-299gv\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:28:23.730940 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.730953 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.730967 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cbc20430-c16b-4431-b863-00fc63370fb0-config-out\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.730979 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.730992 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-cluster-tls-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.731004 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-web-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:28:23.731016 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cbc20430-c16b-4431-b863-00fc63370fb0-tls-assets\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.731026 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cbc20430-c16b-4431-b863-00fc63370fb0-config-volume\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:23.731148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:23.731038 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cbc20430-c16b-4431-b863-00fc63370fb0-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:28:24.526336 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.526298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rqq85" event={"ID":"342209dc-2f51-4fc2-a96f-a19424f86d57","Type":"ContainerStarted","Data":"4177c509c8dc5e51b075c490694ca89ad696281fb24caafa44a8e05b9fb267e1"} Apr 22 19:28:24.526336 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.526333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rqq85" event={"ID":"342209dc-2f51-4fc2-a96f-a19424f86d57","Type":"ContainerStarted","Data":"297edeb649e1f34ee3d2e91f39489ca0679121dc3ec7d8dd1f07d1d919d2e687"} Apr 22 19:28:24.528874 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.528847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cbc20430-c16b-4431-b863-00fc63370fb0","Type":"ContainerDied","Data":"ef9e732b9c5f7f509f84f094d61148df186060afe7dc22266d993c3e5a2ff0b4"} Apr 22 19:28:24.528999 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:28:24.528889 2576 scope.go:117] "RemoveContainer" containerID="beb333d80105113a2c420b79f71f7f81d9de055f337e43c03baa7a0ec6408d6a" Apr 22 19:28:24.528999 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.528927 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.536895 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.536879 2576 scope.go:117] "RemoveContainer" containerID="70c44d14f7658093176dc1b552dded4d0ef4ae92f3599aa2a1f59039c546e351" Apr 22 19:28:24.544133 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.544115 2576 scope.go:117] "RemoveContainer" containerID="d41f41aa65bc2518836ed6bb60a93591906db7d1e49cbd1fe436b2d8ffdf437e" Apr 22 19:28:24.550311 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.550292 2576 scope.go:117] "RemoveContainer" containerID="1a73a52f9488a1a25bc92a89937146fea0f90f0971137fc37e9ce5528a46c07a" Apr 22 19:28:24.556052 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.556035 2576 scope.go:117] "RemoveContainer" containerID="1eb387f1cf203e38384ae0fcd0503c3869804f50c0037a396d86327ea02de862" Apr 22 19:28:24.557167 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.557122 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rqq85" podStartSLOduration=253.532729865 podStartE2EDuration="4m14.557110717s" podCreationTimestamp="2026-04-22 19:24:10 +0000 UTC" firstStartedPulling="2026-04-22 19:28:22.884038475 +0000 UTC m=+252.669028325" lastFinishedPulling="2026-04-22 19:28:23.908419316 +0000 UTC m=+253.693409177" observedRunningTime="2026-04-22 19:28:24.542415524 +0000 UTC m=+254.327405396" watchObservedRunningTime="2026-04-22 19:28:24.557110717 +0000 UTC m=+254.342100606" Apr 22 19:28:24.558424 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.558404 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:28:24.562823 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.562792 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:28:24.565060 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.565046 2576 scope.go:117] "RemoveContainer" containerID="040f0de3f6f4da71f4a666958520c5e49c2640a10b7f0523a4ae0a80f3b57999" Apr 22 19:28:24.571299 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.571274 2576 scope.go:117] "RemoveContainer" containerID="f9faa27f078dcb8ad93a769bf4f905bf652057f3df039fda9f43f86da55d00f7" Apr 22 19:28:24.588856 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.588834 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:28:24.589097 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589086 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy-metric" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589099 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy-metric" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589111 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589117 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589124 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" containerName="console" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589130 2576 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" containerName="console" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589136 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="prom-label-proxy" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589142 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="prom-label-proxy" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589151 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="alertmanager" Apr 22 19:28:24.589153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589155 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="alertmanager" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589163 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="config-reloader" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589169 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="config-reloader" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589175 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="init-config-reloader" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589180 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="init-config-reloader" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589188 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy-web" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589193 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy-web" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589240 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="config-reloader" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589251 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="113ae80e-54eb-4ea7-b6b1-cadca67fc9c3" containerName="console" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589260 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="prom-label-proxy" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589268 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy-metric" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589279 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="alertmanager" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589288 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy" Apr 22 19:28:24.589445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.589299 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" containerName="kube-rbac-proxy-web" Apr 22 19:28:24.594785 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.594770 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.597347 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.597327 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 19:28:24.597453 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.597331 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 19:28:24.597453 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.597336 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 19:28:24.597453 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.597385 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 19:28:24.597453 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.597387 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 19:28:24.597657 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.597640 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 19:28:24.597706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.597685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 19:28:24.597759 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.597720 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 19:28:24.598043 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.598026 2576 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-pj4lg\"" Apr 22 19:28:24.603898 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.603875 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 19:28:24.605329 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.605307 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:28:24.639934 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.639897 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5743be61-1cf4-4c4b-94fc-857af4fd539e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.639934 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.639932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640200 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-config-volume\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640200 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640200 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7x5g\" (UniqueName: \"kubernetes.io/projected/5743be61-1cf4-4c4b-94fc-857af4fd539e-kube-api-access-v7x5g\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640200 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5743be61-1cf4-4c4b-94fc-857af4fd539e-config-out\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640332 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5743be61-1cf4-4c4b-94fc-857af4fd539e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640332 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640332 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5743be61-1cf4-4c4b-94fc-857af4fd539e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640483 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640483 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-web-config\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640592 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.640592 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.640565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/5743be61-1cf4-4c4b-94fc-857af4fd539e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741354 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-config-volume\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741354 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741683 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7x5g\" (UniqueName: \"kubernetes.io/projected/5743be61-1cf4-4c4b-94fc-857af4fd539e-kube-api-access-v7x5g\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741683 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5743be61-1cf4-4c4b-94fc-857af4fd539e-config-out\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741683 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5743be61-1cf4-4c4b-94fc-857af4fd539e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741683 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741683 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5743be61-1cf4-4c4b-94fc-857af4fd539e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741956 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741956 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-web-config\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741956 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:28:24.741826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741956 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5743be61-1cf4-4c4b-94fc-857af4fd539e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.741956 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5743be61-1cf4-4c4b-94fc-857af4fd539e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.742193 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.742193 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.741988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5743be61-1cf4-4c4b-94fc-857af4fd539e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.742447 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.742421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5743be61-1cf4-4c4b-94fc-857af4fd539e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.742830 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.742718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5743be61-1cf4-4c4b-94fc-857af4fd539e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.744302 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.744261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5743be61-1cf4-4c4b-94fc-857af4fd539e-config-out\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.744665 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.744638 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.745249 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.744774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-cluster-tls-config\") pod \"alertmanager-main-0\" 
(UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.745249 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.745024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.745249 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.745209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-config-volume\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.745249 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.745244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5743be61-1cf4-4c4b-94fc-857af4fd539e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.745488 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.745375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.745680 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.745660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-web-config\") pod 
\"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.746313 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.746289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5743be61-1cf4-4c4b-94fc-857af4fd539e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.755029 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.755008 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc20430-c16b-4431-b863-00fc63370fb0" path="/var/lib/kubelet/pods/cbc20430-c16b-4431-b863-00fc63370fb0/volumes" Apr 22 19:28:24.761614 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.761596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7x5g\" (UniqueName: \"kubernetes.io/projected/5743be61-1cf4-4c4b-94fc-857af4fd539e-kube-api-access-v7x5g\") pod \"alertmanager-main-0\" (UID: \"5743be61-1cf4-4c4b-94fc-857af4fd539e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:24.906308 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:24.906223 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:28:25.032181 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:25.032126 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:28:25.034123 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:28:25.034096 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5743be61_1cf4_4c4b_94fc_857af4fd539e.slice/crio-3b9b0929e02b354122ebd1f7413643e429a3240c66bb787e715d92e5515ceae2 WatchSource:0}: Error finding container 3b9b0929e02b354122ebd1f7413643e429a3240c66bb787e715d92e5515ceae2: Status 404 returned error can't find the container with id 3b9b0929e02b354122ebd1f7413643e429a3240c66bb787e715d92e5515ceae2 Apr 22 19:28:25.533015 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:25.532976 2576 generic.go:358] "Generic (PLEG): container finished" podID="5743be61-1cf4-4c4b-94fc-857af4fd539e" containerID="a9d7c4f32a7271ec1be9a5826c51707201643434eded0391ef0b4c4a940875a4" exitCode=0 Apr 22 19:28:25.533419 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:25.533070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5743be61-1cf4-4c4b-94fc-857af4fd539e","Type":"ContainerDied","Data":"a9d7c4f32a7271ec1be9a5826c51707201643434eded0391ef0b4c4a940875a4"} Apr 22 19:28:25.533419 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:25.533106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5743be61-1cf4-4c4b-94fc-857af4fd539e","Type":"ContainerStarted","Data":"3b9b0929e02b354122ebd1f7413643e429a3240c66bb787e715d92e5515ceae2"} Apr 22 19:28:26.346043 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.346004 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5966944dd6-gn5mz"] Apr 22 19:28:26.349339 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:28:26.349316 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.352715 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.352688 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 19:28:26.354078 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.354035 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 19:28:26.354078 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.354055 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 19:28:26.354246 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.354041 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 19:28:26.354246 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.354146 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-rvpzd\"" Apr 22 19:28:26.355090 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.355073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 19:28:26.358610 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.358586 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 19:28:26.367206 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.367184 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5966944dd6-gn5mz"] Apr 22 19:28:26.457142 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:28:26.457109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-telemeter-client-tls\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.457142 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.457140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4477w\" (UniqueName: \"kubernetes.io/projected/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-kube-api-access-4477w\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.457377 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.457159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.457377 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.457177 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-metrics-client-ca\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.457377 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.457294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-federate-client-tls\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.457377 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.457346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-secret-telemeter-client\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.457589 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.457381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.457589 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.457431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-serving-certs-ca-bundle\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.540490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.540447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"5743be61-1cf4-4c4b-94fc-857af4fd539e","Type":"ContainerStarted","Data":"4a83b160d9b7d276323c7641f7abe65225e4960bcbecf3726abd2e8e34b86f7a"} Apr 22 19:28:26.540490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.540490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5743be61-1cf4-4c4b-94fc-857af4fd539e","Type":"ContainerStarted","Data":"83373c7fff1ab276cc7336d7d521a34ae531efe2eafcc23455ffa9562ef581d7"} Apr 22 19:28:26.540906 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.540518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5743be61-1cf4-4c4b-94fc-857af4fd539e","Type":"ContainerStarted","Data":"6a7434b4567f668b33e532bcf22df4b03de53d5b5d7ad9764afe29f1b9ccd1ff"} Apr 22 19:28:26.540906 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.540530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5743be61-1cf4-4c4b-94fc-857af4fd539e","Type":"ContainerStarted","Data":"74e25426026ebeea0560abf008542783f996b3f33902f6ed59bf330f04df493c"} Apr 22 19:28:26.540906 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.540539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5743be61-1cf4-4c4b-94fc-857af4fd539e","Type":"ContainerStarted","Data":"31ad845244723775f82008cd218a93574de6dffcf58bf08756e4c26343c04ae9"} Apr 22 19:28:26.540906 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.540547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5743be61-1cf4-4c4b-94fc-857af4fd539e","Type":"ContainerStarted","Data":"3510f1dd4f331ae8d9d29e1d0d96097a73754b475d1b9633738ac80a27320316"} Apr 22 19:28:26.558701 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.558677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-federate-client-tls\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.558754 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.558717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-secret-telemeter-client\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.558754 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.558738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.558810 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.558764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-serving-certs-ca-bundle\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.558810 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.558800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-telemeter-client-tls\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " 
pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.558901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.558815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4477w\" (UniqueName: \"kubernetes.io/projected/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-kube-api-access-4477w\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.558901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.558837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.558901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.558862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-metrics-client-ca\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.559706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.559676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-serving-certs-ca-bundle\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.559862 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.559704 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-metrics-client-ca\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.559957 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.559939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.561429 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.561410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-telemeter-client-tls\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.561542 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.561489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.561627 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.561599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-secret-telemeter-client\") pod \"telemeter-client-5966944dd6-gn5mz\" 
(UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.561674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.561663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-federate-client-tls\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.568209 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.568188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4477w\" (UniqueName: \"kubernetes.io/projected/1ee74941-d14c-462e-a4a9-e8ed9058b8dc-kube-api-access-4477w\") pod \"telemeter-client-5966944dd6-gn5mz\" (UID: \"1ee74941-d14c-462e-a4a9-e8ed9058b8dc\") " pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.570708 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.570665 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.570651727 podStartE2EDuration="2.570651727s" podCreationTimestamp="2026-04-22 19:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:28:26.56890768 +0000 UTC m=+256.353897568" watchObservedRunningTime="2026-04-22 19:28:26.570651727 +0000 UTC m=+256.355641598" Apr 22 19:28:26.659841 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.659755 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" Apr 22 19:28:26.790682 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:26.790651 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5966944dd6-gn5mz"] Apr 22 19:28:26.794300 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:28:26.794272 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee74941_d14c_462e_a4a9_e8ed9058b8dc.slice/crio-5b0aba2a622f23a65d87e3f5e078491a8e3d9ce9a2ede255b9539c07054e0337 WatchSource:0}: Error finding container 5b0aba2a622f23a65d87e3f5e078491a8e3d9ce9a2ede255b9539c07054e0337: Status 404 returned error can't find the container with id 5b0aba2a622f23a65d87e3f5e078491a8e3d9ce9a2ede255b9539c07054e0337 Apr 22 19:28:27.545594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:27.545554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" event={"ID":"1ee74941-d14c-462e-a4a9-e8ed9058b8dc","Type":"ContainerStarted","Data":"5b0aba2a622f23a65d87e3f5e078491a8e3d9ce9a2ede255b9539c07054e0337"} Apr 22 19:28:28.549770 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:28.549733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" event={"ID":"1ee74941-d14c-462e-a4a9-e8ed9058b8dc","Type":"ContainerStarted","Data":"2bb1d1716ddb9b964599b364abf373227d4e8f9d99713bc96197a3d1c68a7e9b"} Apr 22 19:28:29.554677 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:29.554645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" event={"ID":"1ee74941-d14c-462e-a4a9-e8ed9058b8dc","Type":"ContainerStarted","Data":"64a19cfda8086cd5ee9fa2e716e6bf76ac9e1efe8b4a63caa0e8e956e6a7b569"} Apr 22 19:28:29.554677 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:29.554680 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" event={"ID":"1ee74941-d14c-462e-a4a9-e8ed9058b8dc","Type":"ContainerStarted","Data":"a1dcacbc901d69a1c5f754ef3cf473110ba4c915792c247f023a1db9faa9ea86"} Apr 22 19:28:29.582155 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:29.582097 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5966944dd6-gn5mz" podStartSLOduration=1.894791795 podStartE2EDuration="3.582078452s" podCreationTimestamp="2026-04-22 19:28:26 +0000 UTC" firstStartedPulling="2026-04-22 19:28:26.79620629 +0000 UTC m=+256.581196140" lastFinishedPulling="2026-04-22 19:28:28.483492946 +0000 UTC m=+258.268482797" observedRunningTime="2026-04-22 19:28:29.58014006 +0000 UTC m=+259.365129932" watchObservedRunningTime="2026-04-22 19:28:29.582078452 +0000 UTC m=+259.367068325" Apr 22 19:28:30.127003 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.126973 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-dfc85b5d-828lv"] Apr 22 19:28:30.130612 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.130588 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.158776 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.158743 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dfc85b5d-828lv"] Apr 22 19:28:30.193739 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.193704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-trusted-ca-bundle\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.193910 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.193801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-console-config\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.193910 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.193846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-oauth-serving-cert\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.193910 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.193884 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-oauth-config\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.194069 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.193930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-service-ca\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.194069 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.193960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-serving-cert\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.194069 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.193988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btjq9\" (UniqueName: \"kubernetes.io/projected/69b4bbc3-5c33-49da-b312-a490811fa401-kube-api-access-btjq9\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.294842 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.294806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-service-ca\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.295028 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.294848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-serving-cert\") pod \"console-dfc85b5d-828lv\" (UID: 
\"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.295028 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.294868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btjq9\" (UniqueName: \"kubernetes.io/projected/69b4bbc3-5c33-49da-b312-a490811fa401-kube-api-access-btjq9\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.295028 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.294892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-trusted-ca-bundle\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.295028 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.294937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-console-config\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.295028 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.294961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-oauth-serving-cert\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.295028 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.294989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-oauth-config\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.295736 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.295706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-console-config\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.295867 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.295778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-oauth-serving-cert\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.296029 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.296003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-trusted-ca-bundle\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.296203 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.296184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-service-ca\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.297327 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.297308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-oauth-config\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.297467 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.297449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-serving-cert\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.305197 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.305178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btjq9\" (UniqueName: \"kubernetes.io/projected/69b4bbc3-5c33-49da-b312-a490811fa401-kube-api-access-btjq9\") pod \"console-dfc85b5d-828lv\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") " pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:28:30.440304 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.440218 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-dfc85b5d-828lv"
Apr 22 19:28:30.568806 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:30.568691 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-dfc85b5d-828lv"]
Apr 22 19:28:30.570999 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:28:30.570971 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b4bbc3_5c33_49da_b312_a490811fa401.slice/crio-14e6e106f0a3fd32f8d945d6ad5540726016436e14b79a8e25ea4fce0fa33ce4 WatchSource:0}: Error finding container 14e6e106f0a3fd32f8d945d6ad5540726016436e14b79a8e25ea4fce0fa33ce4: Status 404 returned error can't find the container with id 14e6e106f0a3fd32f8d945d6ad5540726016436e14b79a8e25ea4fce0fa33ce4
Apr 22 19:28:31.561585 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:31.561549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfc85b5d-828lv" event={"ID":"69b4bbc3-5c33-49da-b312-a490811fa401","Type":"ContainerStarted","Data":"b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509"}
Apr 22 19:28:31.561585 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:31.561589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfc85b5d-828lv" event={"ID":"69b4bbc3-5c33-49da-b312-a490811fa401","Type":"ContainerStarted","Data":"14e6e106f0a3fd32f8d945d6ad5540726016436e14b79a8e25ea4fce0fa33ce4"}
Apr 22 19:28:31.582487 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:31.582441 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-dfc85b5d-828lv" podStartSLOduration=1.5824248330000001 podStartE2EDuration="1.582424833s" podCreationTimestamp="2026-04-22 19:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:28:31.580889281 +0000 UTC m=+261.365879178" watchObservedRunningTime="2026-04-22 19:28:31.582424833 +0000 UTC m=+261.367414705"
Apr 22 19:28:40.440568 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:40.440532 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-dfc85b5d-828lv"
Apr 22 19:28:40.440568 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:40.440577 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-dfc85b5d-828lv"
Apr 22 19:28:40.445083 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:40.445061 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-dfc85b5d-828lv"
Apr 22 19:28:40.591337 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:40.591308 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-dfc85b5d-828lv"
Apr 22 19:28:40.645429 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:28:40.645397 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c8d88db7-552xk"]
Apr 22 19:29:05.665718 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:05.665656 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76c8d88db7-552xk" podUID="c7637f0b-c587-4c39-b962-34f0fe988685" containerName="console" containerID="cri-o://733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6" gracePeriod=15
Apr 22 19:29:05.902214 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:05.902193 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c8d88db7-552xk_c7637f0b-c587-4c39-b962-34f0fe988685/console/0.log"
Apr 22 19:29:05.902329 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:05.902252 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c8d88db7-552xk"
Apr 22 19:29:06.012321 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012233 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-oauth-serving-cert\") pod \"c7637f0b-c587-4c39-b962-34f0fe988685\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") "
Apr 22 19:29:06.012321 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012270 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbfzw\" (UniqueName: \"kubernetes.io/projected/c7637f0b-c587-4c39-b962-34f0fe988685-kube-api-access-mbfzw\") pod \"c7637f0b-c587-4c39-b962-34f0fe988685\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") "
Apr 22 19:29:06.012321 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012296 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-serving-cert\") pod \"c7637f0b-c587-4c39-b962-34f0fe988685\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") "
Apr 22 19:29:06.012636 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012348 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-oauth-config\") pod \"c7637f0b-c587-4c39-b962-34f0fe988685\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") "
Apr 22 19:29:06.012636 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012388 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-service-ca\") pod \"c7637f0b-c587-4c39-b962-34f0fe988685\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") "
Apr 22 19:29:06.012636 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012414 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-trusted-ca-bundle\") pod \"c7637f0b-c587-4c39-b962-34f0fe988685\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") "
Apr 22 19:29:06.012636 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-console-config\") pod \"c7637f0b-c587-4c39-b962-34f0fe988685\" (UID: \"c7637f0b-c587-4c39-b962-34f0fe988685\") "
Apr 22 19:29:06.012833 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012769 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c7637f0b-c587-4c39-b962-34f0fe988685" (UID: "c7637f0b-c587-4c39-b962-34f0fe988685"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:29:06.012888 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012843 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-service-ca" (OuterVolumeSpecName: "service-ca") pod "c7637f0b-c587-4c39-b962-34f0fe988685" (UID: "c7637f0b-c587-4c39-b962-34f0fe988685"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:29:06.012949 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012877 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c7637f0b-c587-4c39-b962-34f0fe988685" (UID: "c7637f0b-c587-4c39-b962-34f0fe988685"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:29:06.012949 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.012899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-console-config" (OuterVolumeSpecName: "console-config") pod "c7637f0b-c587-4c39-b962-34f0fe988685" (UID: "c7637f0b-c587-4c39-b962-34f0fe988685"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:29:06.014563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.014542 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c7637f0b-c587-4c39-b962-34f0fe988685" (UID: "c7637f0b-c587-4c39-b962-34f0fe988685"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:29:06.014646 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.014581 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7637f0b-c587-4c39-b962-34f0fe988685-kube-api-access-mbfzw" (OuterVolumeSpecName: "kube-api-access-mbfzw") pod "c7637f0b-c587-4c39-b962-34f0fe988685" (UID: "c7637f0b-c587-4c39-b962-34f0fe988685"). InnerVolumeSpecName "kube-api-access-mbfzw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:29:06.014769 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.014747 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c7637f0b-c587-4c39-b962-34f0fe988685" (UID: "c7637f0b-c587-4c39-b962-34f0fe988685"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:29:06.114040 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.114001 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-console-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:29:06.114040 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.114033 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-oauth-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:29:06.114040 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.114043 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbfzw\" (UniqueName: \"kubernetes.io/projected/c7637f0b-c587-4c39-b962-34f0fe988685-kube-api-access-mbfzw\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:29:06.114270 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.114053 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:29:06.114270 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.114063 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c7637f0b-c587-4c39-b962-34f0fe988685-console-oauth-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:29:06.114270 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.114071 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-service-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:29:06.114270 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.114080 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7637f0b-c587-4c39-b962-34f0fe988685-trusted-ca-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:29:06.664237 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.664206 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c8d88db7-552xk_c7637f0b-c587-4c39-b962-34f0fe988685/console/0.log"
Apr 22 19:29:06.664402 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.664251 2576 generic.go:358] "Generic (PLEG): container finished" podID="c7637f0b-c587-4c39-b962-34f0fe988685" containerID="733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6" exitCode=2
Apr 22 19:29:06.664402 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.664316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c8d88db7-552xk" event={"ID":"c7637f0b-c587-4c39-b962-34f0fe988685","Type":"ContainerDied","Data":"733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6"}
Apr 22 19:29:06.664402 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.664351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c8d88db7-552xk" event={"ID":"c7637f0b-c587-4c39-b962-34f0fe988685","Type":"ContainerDied","Data":"eb34c9213292fdcbb9ced37714497ffa7412bc34944b9c3260405393f9918412"}
Apr 22 19:29:06.664402 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.664370 2576 scope.go:117] "RemoveContainer" containerID="733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6"
Apr 22 19:29:06.664640 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.664322 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c8d88db7-552xk"
Apr 22 19:29:06.676000 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.675831 2576 scope.go:117] "RemoveContainer" containerID="733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6"
Apr 22 19:29:06.676212 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:29:06.676075 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6\": container with ID starting with 733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6 not found: ID does not exist" containerID="733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6"
Apr 22 19:29:06.676212 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.676100 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6"} err="failed to get container status \"733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6\": rpc error: code = NotFound desc = could not find container \"733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6\": container with ID starting with 733eaca788008210c49d76562406d657d0619709966e391a3efe63734c794fa6 not found: ID does not exist"
Apr 22 19:29:06.687760 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.687739 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c8d88db7-552xk"]
Apr 22 19:29:06.691764 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.691743 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76c8d88db7-552xk"]
Apr 22 19:29:06.754154 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:06.754129 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7637f0b-c587-4c39-b962-34f0fe988685" path="/var/lib/kubelet/pods/c7637f0b-c587-4c39-b962-34f0fe988685/volumes"
Apr 22 19:29:10.643744 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:10.643711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log"
Apr 22 19:29:10.645051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:10.645031 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log"
Apr 22 19:29:10.647818 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:10.647797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:29:10.649105 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:10.649084 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:29:10.654402 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:10.654381 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 19:29:49.502790 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.502757 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-777859b957-v65cx"]
Apr 22 19:29:49.503289 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.503215 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7637f0b-c587-4c39-b962-34f0fe988685" containerName="console"
Apr 22 19:29:49.503289 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.503236 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7637f0b-c587-4c39-b962-34f0fe988685" containerName="console"
Apr 22 19:29:49.503402 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.503315 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7637f0b-c587-4c39-b962-34f0fe988685" containerName="console"
Apr 22 19:29:49.510704 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.510679 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.517991 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.517957 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-777859b957-v65cx"]
Apr 22 19:29:49.587459 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.587426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-service-ca\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.587459 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.587459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-oauth-serving-cert\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.587672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.587489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-oauth-config\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.587672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.587561 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pddr4\" (UniqueName: \"kubernetes.io/projected/6a673d89-1c34-4821-9944-d9156391059a-kube-api-access-pddr4\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.587672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.587598 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-trusted-ca-bundle\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.587672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.587621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-serving-cert\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.587672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.587637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-console-config\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.688580 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.688541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-oauth-config\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.688777 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.688592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pddr4\" (UniqueName: \"kubernetes.io/projected/6a673d89-1c34-4821-9944-d9156391059a-kube-api-access-pddr4\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.688777 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.688627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-trusted-ca-bundle\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.688777 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.688660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-serving-cert\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.688777 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.688682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-console-config\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.688777 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.688766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-service-ca\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.689038 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.688790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-oauth-serving-cert\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.689567 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.689542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-console-config\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.689637 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.689549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-oauth-serving-cert\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.689637 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.689617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-service-ca\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.689905 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.689883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-trusted-ca-bundle\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.691223 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.691199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-serving-cert\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.691318 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.691229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-oauth-config\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.696617 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.696598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pddr4\" (UniqueName: \"kubernetes.io/projected/6a673d89-1c34-4821-9944-d9156391059a-kube-api-access-pddr4\") pod \"console-777859b957-v65cx\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") " pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.827046 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.827009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:49.953085 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.953058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-777859b957-v65cx"]
Apr 22 19:29:49.955751 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:29:49.955714 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a673d89_1c34_4821_9944_d9156391059a.slice/crio-3afb93d71d66e4e621597e0cb3aef5ebe0a3bb767a8c105b34873e8401948a40 WatchSource:0}: Error finding container 3afb93d71d66e4e621597e0cb3aef5ebe0a3bb767a8c105b34873e8401948a40: Status 404 returned error can't find the container with id 3afb93d71d66e4e621597e0cb3aef5ebe0a3bb767a8c105b34873e8401948a40
Apr 22 19:29:49.957460 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:49.957440 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:29:50.786266 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:50.786231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777859b957-v65cx" event={"ID":"6a673d89-1c34-4821-9944-d9156391059a","Type":"ContainerStarted","Data":"dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7"}
Apr 22 19:29:50.786266 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:50.786267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777859b957-v65cx" event={"ID":"6a673d89-1c34-4821-9944-d9156391059a","Type":"ContainerStarted","Data":"3afb93d71d66e4e621597e0cb3aef5ebe0a3bb767a8c105b34873e8401948a40"}
Apr 22 19:29:50.805027 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:50.804981 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-777859b957-v65cx" podStartSLOduration=1.80496737 podStartE2EDuration="1.80496737s" podCreationTimestamp="2026-04-22 19:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:29:50.803301289 +0000 UTC m=+340.588291174" watchObservedRunningTime="2026-04-22 19:29:50.80496737 +0000 UTC m=+340.589957241"
Apr 22 19:29:59.827186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:59.827156 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:59.827186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:59.827194 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:29:59.831790 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:29:59.831765 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:30:00.827832 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:00.827805 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:30:00.876366 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:00.876332 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dfc85b5d-828lv"]
Apr 22 19:30:25.899139 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:25.899065 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-dfc85b5d-828lv" podUID="69b4bbc3-5c33-49da-b312-a490811fa401" containerName="console" containerID="cri-o://b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509" gracePeriod=15
Apr 22 19:30:26.128965 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.128943 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dfc85b5d-828lv_69b4bbc3-5c33-49da-b312-a490811fa401/console/0.log"
Apr 22 19:30:26.129064 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.129004 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-dfc85b5d-828lv"
Apr 22 19:30:26.305201 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305170 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-console-config\") pod \"69b4bbc3-5c33-49da-b312-a490811fa401\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") "
Apr 22 19:30:26.305201 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305214 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btjq9\" (UniqueName: \"kubernetes.io/projected/69b4bbc3-5c33-49da-b312-a490811fa401-kube-api-access-btjq9\") pod \"69b4bbc3-5c33-49da-b312-a490811fa401\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") "
Apr 22 19:30:26.305462 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305260 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-oauth-serving-cert\") pod \"69b4bbc3-5c33-49da-b312-a490811fa401\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") "
Apr 22 19:30:26.305462 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305311 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-serving-cert\") pod \"69b4bbc3-5c33-49da-b312-a490811fa401\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") "
Apr 22 19:30:26.305462 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305355 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-oauth-config\") pod \"69b4bbc3-5c33-49da-b312-a490811fa401\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") "
Apr 22 19:30:26.305462 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305411 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-trusted-ca-bundle\") pod \"69b4bbc3-5c33-49da-b312-a490811fa401\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") "
Apr 22 19:30:26.305462 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305448 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-service-ca\") pod \"69b4bbc3-5c33-49da-b312-a490811fa401\" (UID: \"69b4bbc3-5c33-49da-b312-a490811fa401\") "
Apr 22 19:30:26.305765 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305735 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-console-config" (OuterVolumeSpecName: "console-config") pod "69b4bbc3-5c33-49da-b312-a490811fa401" (UID: "69b4bbc3-5c33-49da-b312-a490811fa401"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:30:26.305812 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305749 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "69b4bbc3-5c33-49da-b312-a490811fa401" (UID: "69b4bbc3-5c33-49da-b312-a490811fa401"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:30:26.305875 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305798 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-service-ca" (OuterVolumeSpecName: "service-ca") pod "69b4bbc3-5c33-49da-b312-a490811fa401" (UID: "69b4bbc3-5c33-49da-b312-a490811fa401"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:30:26.305875 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.305853 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "69b4bbc3-5c33-49da-b312-a490811fa401" (UID: "69b4bbc3-5c33-49da-b312-a490811fa401"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:30:26.307558 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.307524 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "69b4bbc3-5c33-49da-b312-a490811fa401" (UID: "69b4bbc3-5c33-49da-b312-a490811fa401"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:30:26.307788 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.307767 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "69b4bbc3-5c33-49da-b312-a490811fa401" (UID: "69b4bbc3-5c33-49da-b312-a490811fa401"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:30:26.307903 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.307786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69b4bbc3-5c33-49da-b312-a490811fa401-kube-api-access-btjq9" (OuterVolumeSpecName: "kube-api-access-btjq9") pod "69b4bbc3-5c33-49da-b312-a490811fa401" (UID: "69b4bbc3-5c33-49da-b312-a490811fa401"). InnerVolumeSpecName "kube-api-access-btjq9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:30:26.406446 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.406412 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-oauth-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:30:26.406446 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.406439 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:30:26.406446 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.406450 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69b4bbc3-5c33-49da-b312-a490811fa401-console-oauth-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:30:26.406674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.406460 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-trusted-ca-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:30:26.406674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.406469 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-service-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:30:26.406674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.406477 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69b4bbc3-5c33-49da-b312-a490811fa401-console-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:30:26.406674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.406486 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-btjq9\" (UniqueName: \"kubernetes.io/projected/69b4bbc3-5c33-49da-b312-a490811fa401-kube-api-access-btjq9\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:30:26.903061 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.903035 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-dfc85b5d-828lv_69b4bbc3-5c33-49da-b312-a490811fa401/console/0.log"
Apr 22 19:30:26.903455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.903079 2576 generic.go:358] "Generic (PLEG): container finished" podID="69b4bbc3-5c33-49da-b312-a490811fa401" containerID="b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509" exitCode=2
Apr 22 19:30:26.903455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.903138 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-dfc85b5d-828lv" Apr 22 19:30:26.903455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.903156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfc85b5d-828lv" event={"ID":"69b4bbc3-5c33-49da-b312-a490811fa401","Type":"ContainerDied","Data":"b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509"} Apr 22 19:30:26.903455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.903192 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-dfc85b5d-828lv" event={"ID":"69b4bbc3-5c33-49da-b312-a490811fa401","Type":"ContainerDied","Data":"14e6e106f0a3fd32f8d945d6ad5540726016436e14b79a8e25ea4fce0fa33ce4"} Apr 22 19:30:26.903455 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.903207 2576 scope.go:117] "RemoveContainer" containerID="b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509" Apr 22 19:30:26.910965 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.910948 2576 scope.go:117] "RemoveContainer" containerID="b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509" Apr 22 19:30:26.911222 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:30:26.911199 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509\": container with ID starting with b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509 not found: ID does not exist" containerID="b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509" Apr 22 19:30:26.911277 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.911230 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509"} err="failed to get container status \"b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509\": rpc error: code = 
NotFound desc = could not find container \"b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509\": container with ID starting with b4d5f2dd380f1e2b1101077aeec091906ae151910f642713005115be11ea6509 not found: ID does not exist" Apr 22 19:30:26.920884 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.920825 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-dfc85b5d-828lv"] Apr 22 19:30:26.923323 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.923305 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-dfc85b5d-828lv"] Apr 22 19:30:26.980167 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.980136 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8c65l"] Apr 22 19:30:26.980439 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.980426 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69b4bbc3-5c33-49da-b312-a490811fa401" containerName="console" Apr 22 19:30:26.980439 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.980439 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="69b4bbc3-5c33-49da-b312-a490811fa401" containerName="console" Apr 22 19:30:26.980604 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.980521 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="69b4bbc3-5c33-49da-b312-a490811fa401" containerName="console" Apr 22 19:30:26.984457 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.984442 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:26.986797 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.986780 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:30:26.989853 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:26.989825 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8c65l"] Apr 22 19:30:27.113298 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.113265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f9d8901e-a9f8-4dab-bc05-d442f52d1851-kubelet-config\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.113494 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.113321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9d8901e-a9f8-4dab-bc05-d442f52d1851-original-pull-secret\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.113494 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.113411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f9d8901e-a9f8-4dab-bc05-d442f52d1851-dbus\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.214260 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.214228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/f9d8901e-a9f8-4dab-bc05-d442f52d1851-original-pull-secret\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.214434 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.214277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f9d8901e-a9f8-4dab-bc05-d442f52d1851-dbus\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.214434 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.214406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f9d8901e-a9f8-4dab-bc05-d442f52d1851-kubelet-config\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.214434 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.214420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f9d8901e-a9f8-4dab-bc05-d442f52d1851-dbus\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.214591 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.214471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f9d8901e-a9f8-4dab-bc05-d442f52d1851-kubelet-config\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.216646 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.216619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f9d8901e-a9f8-4dab-bc05-d442f52d1851-original-pull-secret\") pod \"global-pull-secret-syncer-8c65l\" (UID: \"f9d8901e-a9f8-4dab-bc05-d442f52d1851\") " pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.293739 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.293697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8c65l" Apr 22 19:30:27.411109 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.411079 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8c65l"] Apr 22 19:30:27.414128 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:30:27.414103 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d8901e_a9f8_4dab_bc05_d442f52d1851.slice/crio-83d9723eec33cd7067e2b084c0987f4d1299e62b011b2e66c3eae16be56b789c WatchSource:0}: Error finding container 83d9723eec33cd7067e2b084c0987f4d1299e62b011b2e66c3eae16be56b789c: Status 404 returned error can't find the container with id 83d9723eec33cd7067e2b084c0987f4d1299e62b011b2e66c3eae16be56b789c Apr 22 19:30:27.908885 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:27.908841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8c65l" event={"ID":"f9d8901e-a9f8-4dab-bc05-d442f52d1851","Type":"ContainerStarted","Data":"83d9723eec33cd7067e2b084c0987f4d1299e62b011b2e66c3eae16be56b789c"} Apr 22 19:30:28.755361 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:28.755320 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69b4bbc3-5c33-49da-b312-a490811fa401" path="/var/lib/kubelet/pods/69b4bbc3-5c33-49da-b312-a490811fa401/volumes" Apr 22 19:30:31.922236 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:31.922193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-8c65l" event={"ID":"f9d8901e-a9f8-4dab-bc05-d442f52d1851","Type":"ContainerStarted","Data":"8c047cbc222f466ad920dad02c52c53478e7423604f68a32f318d030bfd243da"} Apr 22 19:30:31.939290 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:30:31.939244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8c65l" podStartSLOduration=2.254520477 podStartE2EDuration="5.939229893s" podCreationTimestamp="2026-04-22 19:30:26 +0000 UTC" firstStartedPulling="2026-04-22 19:30:27.416007729 +0000 UTC m=+377.200997595" lastFinishedPulling="2026-04-22 19:30:31.100717153 +0000 UTC m=+380.885707011" observedRunningTime="2026-04-22 19:30:31.938412025 +0000 UTC m=+381.723401896" watchObservedRunningTime="2026-04-22 19:30:31.939229893 +0000 UTC m=+381.724219764" Apr 22 19:31:06.416912 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.416872 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl"] Apr 22 19:31:06.420847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.420830 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:06.424635 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.424614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 19:31:06.424746 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.424658 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 19:31:06.424746 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.424664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gqv5g\"" Apr 22 19:31:06.424956 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.424940 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 19:31:06.434937 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.434912 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl"] Apr 22 19:31:06.531790 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.531759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl\" (UID: \"c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:06.531967 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.531797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkt6\" (UniqueName: \"kubernetes.io/projected/c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4-kube-api-access-lqkt6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl\" (UID: 
\"c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:06.638217 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.632813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl\" (UID: \"c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:06.638217 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.632934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkt6\" (UniqueName: \"kubernetes.io/projected/c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4-kube-api-access-lqkt6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl\" (UID: \"c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:06.638217 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.635738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl\" (UID: \"c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:06.642178 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.642150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkt6\" (UniqueName: \"kubernetes.io/projected/c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4-kube-api-access-lqkt6\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl\" (UID: \"c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:06.730881 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:31:06.730808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:06.859260 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:06.859120 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl"] Apr 22 19:31:06.862417 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:31:06.862380 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc55e0d2a_46cd_4ecc_8619_02e3b5c43ef4.slice/crio-23fdd9e6ec27b27c84532b7214e5b8a11596a9018e479f9c4b170808f065e197 WatchSource:0}: Error finding container 23fdd9e6ec27b27c84532b7214e5b8a11596a9018e479f9c4b170808f065e197: Status 404 returned error can't find the container with id 23fdd9e6ec27b27c84532b7214e5b8a11596a9018e479f9c4b170808f065e197 Apr 22 19:31:07.024948 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:07.024917 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" event={"ID":"c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4","Type":"ContainerStarted","Data":"23fdd9e6ec27b27c84532b7214e5b8a11596a9018e479f9c4b170808f065e197"} Apr 22 19:31:11.045821 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:11.045783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" event={"ID":"c55e0d2a-46cd-4ecc-8619-02e3b5c43ef4","Type":"ContainerStarted","Data":"0481f93d8dfa5de42066d9f65adc9434bffc6e31253dd7b367ac6897dbda58ba"} Apr 22 19:31:11.046210 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:11.045918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:31:11.068389 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:11.068342 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" podStartSLOduration=1.2625218089999999 podStartE2EDuration="5.068329419s" podCreationTimestamp="2026-04-22 19:31:06 +0000 UTC" firstStartedPulling="2026-04-22 19:31:06.864637159 +0000 UTC m=+416.649627009" lastFinishedPulling="2026-04-22 19:31:10.670444766 +0000 UTC m=+420.455434619" observedRunningTime="2026-04-22 19:31:11.066584212 +0000 UTC m=+420.851574083" watchObservedRunningTime="2026-04-22 19:31:11.068329419 +0000 UTC m=+420.853319291" Apr 22 19:31:32.051723 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:31:32.051648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-dhhxl" Apr 22 19:32:17.715577 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.715544 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x"] Apr 22 19:32:17.718780 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.718762 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x" Apr 22 19:32:17.721354 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.721330 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:32:17.721702 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.721657 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 19:32:17.721835 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.721818 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-w5vd8\"" Apr 22 19:32:17.728316 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.728294 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x"] Apr 22 19:32:17.835038 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.835009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6k7p\" (UniqueName: \"kubernetes.io/projected/7cc79451-3528-46ac-b40a-da5ac7da0cc1-kube-api-access-t6k7p\") pod \"cert-manager-operator-controller-manager-54b9655956-qjq2x\" (UID: \"7cc79451-3528-46ac-b40a-da5ac7da0cc1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x" Apr 22 19:32:17.835209 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.835068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cc79451-3528-46ac-b40a-da5ac7da0cc1-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-qjq2x\" (UID: \"7cc79451-3528-46ac-b40a-da5ac7da0cc1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x" 
Apr 22 19:32:17.936340 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.936287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cc79451-3528-46ac-b40a-da5ac7da0cc1-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-qjq2x\" (UID: \"7cc79451-3528-46ac-b40a-da5ac7da0cc1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x"
Apr 22 19:32:17.936552 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.936414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6k7p\" (UniqueName: \"kubernetes.io/projected/7cc79451-3528-46ac-b40a-da5ac7da0cc1-kube-api-access-t6k7p\") pod \"cert-manager-operator-controller-manager-54b9655956-qjq2x\" (UID: \"7cc79451-3528-46ac-b40a-da5ac7da0cc1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x"
Apr 22 19:32:17.936707 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.936689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7cc79451-3528-46ac-b40a-da5ac7da0cc1-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-qjq2x\" (UID: \"7cc79451-3528-46ac-b40a-da5ac7da0cc1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x"
Apr 22 19:32:17.947393 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:17.947369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6k7p\" (UniqueName: \"kubernetes.io/projected/7cc79451-3528-46ac-b40a-da5ac7da0cc1-kube-api-access-t6k7p\") pod \"cert-manager-operator-controller-manager-54b9655956-qjq2x\" (UID: \"7cc79451-3528-46ac-b40a-da5ac7da0cc1\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x"
Apr 22 19:32:18.028599 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:18.028569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x"
Apr 22 19:32:18.153114 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:18.153084 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x"]
Apr 22 19:32:18.156348 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:32:18.156320 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc79451_3528_46ac_b40a_da5ac7da0cc1.slice/crio-66033b3a86c5f07c573e0a2eb305300d0cc87f48188b475647c29b8c76fc0ab9 WatchSource:0}: Error finding container 66033b3a86c5f07c573e0a2eb305300d0cc87f48188b475647c29b8c76fc0ab9: Status 404 returned error can't find the container with id 66033b3a86c5f07c573e0a2eb305300d0cc87f48188b475647c29b8c76fc0ab9
Apr 22 19:32:18.259695 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:18.259659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x" event={"ID":"7cc79451-3528-46ac-b40a-da5ac7da0cc1","Type":"ContainerStarted","Data":"66033b3a86c5f07c573e0a2eb305300d0cc87f48188b475647c29b8c76fc0ab9"}
Apr 22 19:32:21.272129 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:21.272084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x" event={"ID":"7cc79451-3528-46ac-b40a-da5ac7da0cc1","Type":"ContainerStarted","Data":"5048d9275c7d3e8549d8cc866e26e1d1658521a31e3137354d6648866ef1db12"}
Apr 22 19:32:21.297880 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:21.297818 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-qjq2x" podStartSLOduration=1.679892894 podStartE2EDuration="4.297799692s" podCreationTimestamp="2026-04-22 19:32:17 +0000 UTC" firstStartedPulling="2026-04-22 19:32:18.159445676 +0000 UTC m=+487.944435530" lastFinishedPulling="2026-04-22 19:32:20.777352469 +0000 UTC m=+490.562342328" observedRunningTime="2026-04-22 19:32:21.296910521 +0000 UTC m=+491.081900394" watchObservedRunningTime="2026-04-22 19:32:21.297799692 +0000 UTC m=+491.082789564"
Apr 22 19:32:34.162880 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.162848 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-qltnx"]
Apr 22 19:32:34.166090 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.166066 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-qltnx"
Apr 22 19:32:34.168699 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.168672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-jdvk7\""
Apr 22 19:32:34.168825 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.168676 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 19:32:34.169681 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.169667 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 19:32:34.176016 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.175993 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-qltnx"]
Apr 22 19:32:34.263341 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.263301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l4rs\" (UniqueName: \"kubernetes.io/projected/8543c275-36f9-44ec-87ba-d2eb3d0fd768-kube-api-access-2l4rs\") pod \"cert-manager-79c8d999ff-qltnx\" (UID: \"8543c275-36f9-44ec-87ba-d2eb3d0fd768\") " pod="cert-manager/cert-manager-79c8d999ff-qltnx"
Apr 22 19:32:34.263553 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.263354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8543c275-36f9-44ec-87ba-d2eb3d0fd768-bound-sa-token\") pod \"cert-manager-79c8d999ff-qltnx\" (UID: \"8543c275-36f9-44ec-87ba-d2eb3d0fd768\") " pod="cert-manager/cert-manager-79c8d999ff-qltnx"
Apr 22 19:32:34.364421 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.364388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8543c275-36f9-44ec-87ba-d2eb3d0fd768-bound-sa-token\") pod \"cert-manager-79c8d999ff-qltnx\" (UID: \"8543c275-36f9-44ec-87ba-d2eb3d0fd768\") " pod="cert-manager/cert-manager-79c8d999ff-qltnx"
Apr 22 19:32:34.364615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.364478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2l4rs\" (UniqueName: \"kubernetes.io/projected/8543c275-36f9-44ec-87ba-d2eb3d0fd768-kube-api-access-2l4rs\") pod \"cert-manager-79c8d999ff-qltnx\" (UID: \"8543c275-36f9-44ec-87ba-d2eb3d0fd768\") " pod="cert-manager/cert-manager-79c8d999ff-qltnx"
Apr 22 19:32:34.372396 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.372372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8543c275-36f9-44ec-87ba-d2eb3d0fd768-bound-sa-token\") pod \"cert-manager-79c8d999ff-qltnx\" (UID: \"8543c275-36f9-44ec-87ba-d2eb3d0fd768\") " pod="cert-manager/cert-manager-79c8d999ff-qltnx"
Apr 22 19:32:34.372557 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.372539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l4rs\" (UniqueName: \"kubernetes.io/projected/8543c275-36f9-44ec-87ba-d2eb3d0fd768-kube-api-access-2l4rs\") pod \"cert-manager-79c8d999ff-qltnx\" (UID: \"8543c275-36f9-44ec-87ba-d2eb3d0fd768\") " pod="cert-manager/cert-manager-79c8d999ff-qltnx"
Apr 22 19:32:34.487557 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.487444 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-qltnx"
Apr 22 19:32:34.605685 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:34.605653 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-qltnx"]
Apr 22 19:32:34.608865 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:32:34.608835 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8543c275_36f9_44ec_87ba_d2eb3d0fd768.slice/crio-3cacb60b408e392cace55f3496a0f010491e65d7c3e915ed1601082612bdd3d1 WatchSource:0}: Error finding container 3cacb60b408e392cace55f3496a0f010491e65d7c3e915ed1601082612bdd3d1: Status 404 returned error can't find the container with id 3cacb60b408e392cace55f3496a0f010491e65d7c3e915ed1601082612bdd3d1
Apr 22 19:32:35.321568 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:35.321524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-qltnx" event={"ID":"8543c275-36f9-44ec-87ba-d2eb3d0fd768","Type":"ContainerStarted","Data":"3cacb60b408e392cace55f3496a0f010491e65d7c3e915ed1601082612bdd3d1"}
Apr 22 19:32:37.330694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:37.330663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-qltnx" event={"ID":"8543c275-36f9-44ec-87ba-d2eb3d0fd768","Type":"ContainerStarted","Data":"aebd8e54462d412a2ca1b01cd9aa6cb94434eb0f863e42e4c400fa1bb788bc5a"}
Apr 22 19:32:37.347118 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:37.347071 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-qltnx" podStartSLOduration=1.147528892 podStartE2EDuration="3.347058376s" podCreationTimestamp="2026-04-22 19:32:34 +0000 UTC" firstStartedPulling="2026-04-22 19:32:34.610621441 +0000 UTC m=+504.395611292" lastFinishedPulling="2026-04-22 19:32:36.810150923 +0000 UTC m=+506.595140776" observedRunningTime="2026-04-22 19:32:37.345633125 +0000 UTC m=+507.130622996" watchObservedRunningTime="2026-04-22 19:32:37.347058376 +0000 UTC m=+507.132048245"
Apr 22 19:32:39.666076 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.666033 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"]
Apr 22 19:32:39.669261 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.669244 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"
Apr 22 19:32:39.672051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.672029 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 19:32:39.672186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.672137 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-qs4xp\""
Apr 22 19:32:39.672249 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.672212 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:32:39.680119 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.680093 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"]
Apr 22 19:32:39.708261 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.708217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b50d0c68-2415-4ac0-ac90-80fd5064f054-tmp\") pod \"openshift-lws-operator-bfc7f696d-wcs7r\" (UID: \"b50d0c68-2415-4ac0-ac90-80fd5064f054\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"
Apr 22 19:32:39.708429 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.708271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbpxv\" (UniqueName: \"kubernetes.io/projected/b50d0c68-2415-4ac0-ac90-80fd5064f054-kube-api-access-nbpxv\") pod \"openshift-lws-operator-bfc7f696d-wcs7r\" (UID: \"b50d0c68-2415-4ac0-ac90-80fd5064f054\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"
Apr 22 19:32:39.809778 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.809743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b50d0c68-2415-4ac0-ac90-80fd5064f054-tmp\") pod \"openshift-lws-operator-bfc7f696d-wcs7r\" (UID: \"b50d0c68-2415-4ac0-ac90-80fd5064f054\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"
Apr 22 19:32:39.809973 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.809785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbpxv\" (UniqueName: \"kubernetes.io/projected/b50d0c68-2415-4ac0-ac90-80fd5064f054-kube-api-access-nbpxv\") pod \"openshift-lws-operator-bfc7f696d-wcs7r\" (UID: \"b50d0c68-2415-4ac0-ac90-80fd5064f054\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"
Apr 22 19:32:39.810196 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.810170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b50d0c68-2415-4ac0-ac90-80fd5064f054-tmp\") pod \"openshift-lws-operator-bfc7f696d-wcs7r\" (UID: \"b50d0c68-2415-4ac0-ac90-80fd5064f054\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"
Apr 22 19:32:39.818859 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.818838 2576 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"kube-api-access-nbpxv\" (UniqueName: \"kubernetes.io/projected/b50d0c68-2415-4ac0-ac90-80fd5064f054-kube-api-access-nbpxv\") pod \"openshift-lws-operator-bfc7f696d-wcs7r\" (UID: \"b50d0c68-2415-4ac0-ac90-80fd5064f054\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r" Apr 22 19:32:39.978585 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:39.978487 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r" Apr 22 19:32:40.102132 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:40.102058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r"] Apr 22 19:32:40.104415 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:32:40.104386 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb50d0c68_2415_4ac0_ac90_80fd5064f054.slice/crio-bd7d4de5b7df9def3abed3ebab6d401db4b0c0a9003ff5ad188588f234bddf9d WatchSource:0}: Error finding container bd7d4de5b7df9def3abed3ebab6d401db4b0c0a9003ff5ad188588f234bddf9d: Status 404 returned error can't find the container with id bd7d4de5b7df9def3abed3ebab6d401db4b0c0a9003ff5ad188588f234bddf9d Apr 22 19:32:40.341701 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:40.341667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r" event={"ID":"b50d0c68-2415-4ac0-ac90-80fd5064f054","Type":"ContainerStarted","Data":"bd7d4de5b7df9def3abed3ebab6d401db4b0c0a9003ff5ad188588f234bddf9d"} Apr 22 19:32:43.353541 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:43.353492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r" 
event={"ID":"b50d0c68-2415-4ac0-ac90-80fd5064f054","Type":"ContainerStarted","Data":"766de959fb83469640d5fd9f6e0939cb18dc5607934599a8aab9f4436b066ec6"} Apr 22 19:32:43.388275 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:32:43.388225 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-wcs7r" podStartSLOduration=1.786364276 podStartE2EDuration="4.388210826s" podCreationTimestamp="2026-04-22 19:32:39 +0000 UTC" firstStartedPulling="2026-04-22 19:32:40.105853594 +0000 UTC m=+509.890843448" lastFinishedPulling="2026-04-22 19:32:42.707700144 +0000 UTC m=+512.492689998" observedRunningTime="2026-04-22 19:32:43.387727248 +0000 UTC m=+513.172717121" watchObservedRunningTime="2026-04-22 19:32:43.388210826 +0000 UTC m=+513.173200698" Apr 22 19:33:06.113148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.113117 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-98c76994c-4dslq"] Apr 22 19:33:06.116461 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.116437 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.120391 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.120365 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 19:33:06.120550 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.120421 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 19:33:06.121494 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.121478 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 19:33:06.121590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.121576 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-ghr98\"" Apr 22 19:33:06.131340 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.131317 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-98c76994c-4dslq"] Apr 22 19:33:06.231763 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.231736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-metrics-cert\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.231922 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.231773 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-cert\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " 
pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.231922 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.231874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fc5\" (UniqueName: \"kubernetes.io/projected/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-kube-api-access-44fc5\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.232038 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.231937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-manager-config\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.332910 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.332870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-manager-config\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.333093 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.332932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-metrics-cert\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.333093 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.332962 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-cert\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.333093 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.333032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44fc5\" (UniqueName: \"kubernetes.io/projected/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-kube-api-access-44fc5\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.333640 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.333616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-manager-config\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.335317 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.335300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-metrics-cert\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.335469 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.335446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-cert\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " 
pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.343357 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.343327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44fc5\" (UniqueName: \"kubernetes.io/projected/86a155d5-abb8-4ef0-9aa5-a2e58e3312df-kube-api-access-44fc5\") pod \"lws-controller-manager-98c76994c-4dslq\" (UID: \"86a155d5-abb8-4ef0-9aa5-a2e58e3312df\") " pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.425543 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.425459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:06.560540 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:06.560480 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-98c76994c-4dslq"] Apr 22 19:33:06.564375 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:33:06.564346 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a155d5_abb8_4ef0_9aa5_a2e58e3312df.slice/crio-5d0b38a2a26574af1d3d63e9002eb67742c06b8574d1c8a9d84413e7a8c0e91b WatchSource:0}: Error finding container 5d0b38a2a26574af1d3d63e9002eb67742c06b8574d1c8a9d84413e7a8c0e91b: Status 404 returned error can't find the container with id 5d0b38a2a26574af1d3d63e9002eb67742c06b8574d1c8a9d84413e7a8c0e91b Apr 22 19:33:07.431351 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:07.431314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" event={"ID":"86a155d5-abb8-4ef0-9aa5-a2e58e3312df","Type":"ContainerStarted","Data":"5d0b38a2a26574af1d3d63e9002eb67742c06b8574d1c8a9d84413e7a8c0e91b"} Apr 22 19:33:10.443357 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:10.443323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" event={"ID":"86a155d5-abb8-4ef0-9aa5-a2e58e3312df","Type":"ContainerStarted","Data":"bf2d0530e10773a940ebccdbfa6362ba7bca445e830d9dca1d083964e98a0b8e"} Apr 22 19:33:10.443772 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:10.443433 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:10.462277 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:10.462234 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" podStartSLOduration=0.813922449 podStartE2EDuration="4.462223501s" podCreationTimestamp="2026-04-22 19:33:06 +0000 UTC" firstStartedPulling="2026-04-22 19:33:06.566325954 +0000 UTC m=+536.351315821" lastFinishedPulling="2026-04-22 19:33:10.214627023 +0000 UTC m=+539.999616873" observedRunningTime="2026-04-22 19:33:10.460627627 +0000 UTC m=+540.245617535" watchObservedRunningTime="2026-04-22 19:33:10.462223501 +0000 UTC m=+540.247213373" Apr 22 19:33:21.449107 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:21.449075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-98c76994c-4dslq" Apr 22 19:33:32.602429 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.602394 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-954dd468f-bvk7r"] Apr 22 19:33:32.604762 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.604741 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.616563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.616521 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-954dd468f-bvk7r"] Apr 22 19:33:32.651676 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.651635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-oauth-config\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.651676 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.651669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-config\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.651892 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.651693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-trusted-ca-bundle\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.651892 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.651828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-serving-cert\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 
19:33:32.651994 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.651891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkmx8\" (UniqueName: \"kubernetes.io/projected/2f956b7a-e61e-497e-bf6a-382a5c64e13e-kube-api-access-xkmx8\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.652049 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.652003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-service-ca\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.652049 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.652040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-oauth-serving-cert\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.752571 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.752497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-service-ca\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.752731 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.752594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-oauth-serving-cert\") pod \"console-954dd468f-bvk7r\" 
(UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.752731 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.752681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-oauth-config\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.752731 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.752713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-config\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.752936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.752747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-trusted-ca-bundle\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.752936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.752801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-serving-cert\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.752936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.752856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkmx8\" (UniqueName: 
\"kubernetes.io/projected/2f956b7a-e61e-497e-bf6a-382a5c64e13e-kube-api-access-xkmx8\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.753462 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.753435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-service-ca\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.753645 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.753483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-oauth-serving-cert\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.753781 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.753692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-config\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.753847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.753810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f956b7a-e61e-497e-bf6a-382a5c64e13e-trusted-ca-bundle\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.755302 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.755278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-serving-cert\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.755397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.755336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f956b7a-e61e-497e-bf6a-382a5c64e13e-console-oauth-config\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.764063 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.764045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkmx8\" (UniqueName: \"kubernetes.io/projected/2f956b7a-e61e-497e-bf6a-382a5c64e13e-kube-api-access-xkmx8\") pod \"console-954dd468f-bvk7r\" (UID: \"2f956b7a-e61e-497e-bf6a-382a5c64e13e\") " pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:32.916083 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:32.916003 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:33.044780 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:33.044756 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-954dd468f-bvk7r"] Apr 22 19:33:33.047280 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:33:33.047249 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f956b7a_e61e_497e_bf6a_382a5c64e13e.slice/crio-8feb4072b267b6e87bc89a8fb2d7154735a3d22a82bd550290dae128a3d2884d WatchSource:0}: Error finding container 8feb4072b267b6e87bc89a8fb2d7154735a3d22a82bd550290dae128a3d2884d: Status 404 returned error can't find the container with id 8feb4072b267b6e87bc89a8fb2d7154735a3d22a82bd550290dae128a3d2884d Apr 22 19:33:33.524333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:33.524294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-954dd468f-bvk7r" event={"ID":"2f956b7a-e61e-497e-bf6a-382a5c64e13e","Type":"ContainerStarted","Data":"ac47bfab2d6ff2612a76b001e4e90ee5c89047d66901168a86a85e229e47173c"} Apr 22 19:33:33.524333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:33.524338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-954dd468f-bvk7r" event={"ID":"2f956b7a-e61e-497e-bf6a-382a5c64e13e","Type":"ContainerStarted","Data":"8feb4072b267b6e87bc89a8fb2d7154735a3d22a82bd550290dae128a3d2884d"} Apr 22 19:33:33.544175 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:33.544130 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-954dd468f-bvk7r" podStartSLOduration=1.544116528 podStartE2EDuration="1.544116528s" podCreationTimestamp="2026-04-22 19:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:33:33.542286916 +0000 UTC m=+563.327276796" 
watchObservedRunningTime="2026-04-22 19:33:33.544116528 +0000 UTC m=+563.329106400" Apr 22 19:33:42.916560 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:42.916487 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:42.916560 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:42.916563 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:42.922178 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:42.922155 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:43.560353 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:43.560326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-954dd468f-bvk7r" Apr 22 19:33:43.612204 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:33:43.612168 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-777859b957-v65cx"] Apr 22 19:34:08.637469 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.637407 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-777859b957-v65cx" podUID="6a673d89-1c34-4821-9944-d9156391059a" containerName="console" containerID="cri-o://dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7" gracePeriod=15 Apr 22 19:34:08.879675 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.879652 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-777859b957-v65cx_6a673d89-1c34-4821-9944-d9156391059a/console/0.log" Apr 22 19:34:08.879782 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.879716 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:34:08.982843 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.982769 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-serving-cert\") pod \"6a673d89-1c34-4821-9944-d9156391059a\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") "
Apr 22 19:34:08.982843 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.982814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-oauth-config\") pod \"6a673d89-1c34-4821-9944-d9156391059a\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") "
Apr 22 19:34:08.983055 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.982851 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-service-ca\") pod \"6a673d89-1c34-4821-9944-d9156391059a\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") "
Apr 22 19:34:08.983055 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.982885 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-console-config\") pod \"6a673d89-1c34-4821-9944-d9156391059a\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") "
Apr 22 19:34:08.983055 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.982905 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-trusted-ca-bundle\") pod \"6a673d89-1c34-4821-9944-d9156391059a\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") "
Apr 22 19:34:08.983055 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.982923 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-oauth-serving-cert\") pod \"6a673d89-1c34-4821-9944-d9156391059a\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") "
Apr 22 19:34:08.983055 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.982969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pddr4\" (UniqueName: \"kubernetes.io/projected/6a673d89-1c34-4821-9944-d9156391059a-kube-api-access-pddr4\") pod \"6a673d89-1c34-4821-9944-d9156391059a\" (UID: \"6a673d89-1c34-4821-9944-d9156391059a\") "
Apr 22 19:34:08.983416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.983321 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-service-ca" (OuterVolumeSpecName: "service-ca") pod "6a673d89-1c34-4821-9944-d9156391059a" (UID: "6a673d89-1c34-4821-9944-d9156391059a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:34:08.983496 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.983431 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-console-config" (OuterVolumeSpecName: "console-config") pod "6a673d89-1c34-4821-9944-d9156391059a" (UID: "6a673d89-1c34-4821-9944-d9156391059a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:34:08.983496 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.983444 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6a673d89-1c34-4821-9944-d9156391059a" (UID: "6a673d89-1c34-4821-9944-d9156391059a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:34:08.983604 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.983576 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6a673d89-1c34-4821-9944-d9156391059a" (UID: "6a673d89-1c34-4821-9944-d9156391059a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:34:08.985357 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.985315 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a673d89-1c34-4821-9944-d9156391059a-kube-api-access-pddr4" (OuterVolumeSpecName: "kube-api-access-pddr4") pod "6a673d89-1c34-4821-9944-d9156391059a" (UID: "6a673d89-1c34-4821-9944-d9156391059a"). InnerVolumeSpecName "kube-api-access-pddr4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:34:08.985569 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.985545 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6a673d89-1c34-4821-9944-d9156391059a" (UID: "6a673d89-1c34-4821-9944-d9156391059a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:34:08.985665 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:08.985639 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6a673d89-1c34-4821-9944-d9156391059a" (UID: "6a673d89-1c34-4821-9944-d9156391059a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:34:09.083954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.083921 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-oauth-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:34:09.083954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.083948 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-service-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:34:09.083954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.083959 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-console-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:34:09.084170 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.083967 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-trusted-ca-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:34:09.084170 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.083976 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a673d89-1c34-4821-9944-d9156391059a-oauth-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:34:09.084170 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.083985 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pddr4\" (UniqueName: \"kubernetes.io/projected/6a673d89-1c34-4821-9944-d9156391059a-kube-api-access-pddr4\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:34:09.084170 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.083995 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a673d89-1c34-4821-9944-d9156391059a-console-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:34:09.647539 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.647492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-777859b957-v65cx_6a673d89-1c34-4821-9944-d9156391059a/console/0.log"
Apr 22 19:34:09.647933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.647556 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a673d89-1c34-4821-9944-d9156391059a" containerID="dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7" exitCode=2
Apr 22 19:34:09.647933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.647617 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-777859b957-v65cx"
Apr 22 19:34:09.647933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.647624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777859b957-v65cx" event={"ID":"6a673d89-1c34-4821-9944-d9156391059a","Type":"ContainerDied","Data":"dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7"}
Apr 22 19:34:09.647933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.647656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-777859b957-v65cx" event={"ID":"6a673d89-1c34-4821-9944-d9156391059a","Type":"ContainerDied","Data":"3afb93d71d66e4e621597e0cb3aef5ebe0a3bb767a8c105b34873e8401948a40"}
Apr 22 19:34:09.647933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.647670 2576 scope.go:117] "RemoveContainer" containerID="dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7"
Apr 22 19:34:09.655718 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.655698 2576 scope.go:117] "RemoveContainer" containerID="dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7"
Apr 22 19:34:09.655961 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:34:09.655940 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7\": container with ID starting with dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7 not found: ID does not exist" containerID="dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7"
Apr 22 19:34:09.656014 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.655971 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7"} err="failed to get container status \"dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7\": rpc error: code = NotFound desc = could not find container \"dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7\": container with ID starting with dbfa6d4990c481dc01bbf8ed591f21bdc60f4893a5a02f52c7d74a7fe7b22fc7 not found: ID does not exist"
Apr 22 19:34:09.673421 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.673399 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-777859b957-v65cx"]
Apr 22 19:34:09.679620 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:09.679599 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-777859b957-v65cx"]
Apr 22 19:34:10.669686 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:10.669659 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log"
Apr 22 19:34:10.670325 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:10.670304 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log"
Apr 22 19:34:10.673052 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:10.673035 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:34:10.673776 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:10.673758 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:34:10.754548 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:10.754497 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a673d89-1c34-4821-9944-d9156391059a" path="/var/lib/kubelet/pods/6a673d89-1c34-4821-9944-d9156391059a/volumes"
Apr 22 19:34:13.906400 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.906342 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"]
Apr 22 19:34:13.906922 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.906781 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a673d89-1c34-4821-9944-d9156391059a" containerName="console"
Apr 22 19:34:13.906922 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.906795 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a673d89-1c34-4821-9944-d9156391059a" containerName="console"
Apr 22 19:34:13.906922 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.906877 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a673d89-1c34-4821-9944-d9156391059a" containerName="console"
Apr 22 19:34:13.909479 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.909463 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:13.911978 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.911951 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 22 19:34:13.912197 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.911953 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 22 19:34:13.912197 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.912192 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 19:34:13.912382 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.912325 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-gl48n\""
Apr 22 19:34:13.912457 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.912438 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 19:34:13.920732 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:13.920711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"]
Apr 22 19:34:14.026681 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.026649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq65v\" (UniqueName: \"kubernetes.io/projected/a2e34b1d-33ec-4538-a67b-c74df6e93564-kube-api-access-hq65v\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.026846 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.026688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e34b1d-33ec-4538-a67b-c74df6e93564-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.026846 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.026754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a2e34b1d-33ec-4538-a67b-c74df6e93564-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.127817 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.127779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq65v\" (UniqueName: \"kubernetes.io/projected/a2e34b1d-33ec-4538-a67b-c74df6e93564-kube-api-access-hq65v\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.127817 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.127820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e34b1d-33ec-4538-a67b-c74df6e93564-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.128021 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.127864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a2e34b1d-33ec-4538-a67b-c74df6e93564-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.128021 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:34:14.128002 2576 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 22 19:34:14.128086 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:34:14.128073 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2e34b1d-33ec-4538-a67b-c74df6e93564-plugin-serving-cert podName:a2e34b1d-33ec-4538-a67b-c74df6e93564 nodeName:}" failed. No retries permitted until 2026-04-22 19:34:14.62805613 +0000 UTC m=+604.413045981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a2e34b1d-33ec-4538-a67b-c74df6e93564-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-tbgzt" (UID: "a2e34b1d-33ec-4538-a67b-c74df6e93564") : secret "plugin-serving-cert" not found
Apr 22 19:34:14.128588 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.128571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a2e34b1d-33ec-4538-a67b-c74df6e93564-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.136708 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.136685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq65v\" (UniqueName: \"kubernetes.io/projected/a2e34b1d-33ec-4538-a67b-c74df6e93564-kube-api-access-hq65v\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.632051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.632019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e34b1d-33ec-4538-a67b-c74df6e93564-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.634432 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.634397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2e34b1d-33ec-4538-a67b-c74df6e93564-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-tbgzt\" (UID: \"a2e34b1d-33ec-4538-a67b-c74df6e93564\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.819050 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.819009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"
Apr 22 19:34:14.940725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:14.940702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt"]
Apr 22 19:34:14.943182 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:34:14.943153 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2e34b1d_33ec_4538_a67b_c74df6e93564.slice/crio-be4b0ecba7d72da78b34c104997eef1d81032adb5de3f16df381b13a6692143b WatchSource:0}: Error finding container be4b0ecba7d72da78b34c104997eef1d81032adb5de3f16df381b13a6692143b: Status 404 returned error can't find the container with id be4b0ecba7d72da78b34c104997eef1d81032adb5de3f16df381b13a6692143b
Apr 22 19:34:15.672578 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:15.672527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt" event={"ID":"a2e34b1d-33ec-4538-a67b-c74df6e93564","Type":"ContainerStarted","Data":"be4b0ecba7d72da78b34c104997eef1d81032adb5de3f16df381b13a6692143b"}
Apr 22 19:34:20.696353 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:20.696310 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt" event={"ID":"a2e34b1d-33ec-4538-a67b-c74df6e93564","Type":"ContainerStarted","Data":"b1aea820ae2e27f52561183ffe0830f253fd0ba1b622a1619b949cc046e880a4"}
Apr 22 19:34:20.716409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:20.716364 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-tbgzt" podStartSLOduration=2.809968935 podStartE2EDuration="7.716350157s" podCreationTimestamp="2026-04-22 19:34:13 +0000 UTC" firstStartedPulling="2026-04-22 19:34:14.944489002 +0000 UTC m=+604.729478855" lastFinishedPulling="2026-04-22 19:34:19.85087021 +0000 UTC m=+609.635860077" observedRunningTime="2026-04-22 19:34:20.715691001 +0000 UTC m=+610.500680884" watchObservedRunningTime="2026-04-22 19:34:20.716350157 +0000 UTC m=+610.501340060"
Apr 22 19:34:57.047737 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.047700 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-g27gs"]
Apr 22 19:34:57.051179 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.051148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-g27gs"
Apr 22 19:34:57.053554 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.053490 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wmf2l\""
Apr 22 19:34:57.057566 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.057545 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g27gs"]
Apr 22 19:34:57.155815 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.155783 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-cbpg6"]
Apr 22 19:34:57.158986 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.158972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-cbpg6"
Apr 22 19:34:57.167339 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.167311 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-cbpg6"]
Apr 22 19:34:57.212578 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.212552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqbtg\" (UniqueName: \"kubernetes.io/projected/9abb94d5-a1c5-4e21-b67e-5f86130865f1-kube-api-access-lqbtg\") pod \"authorino-674b59b84c-g27gs\" (UID: \"9abb94d5-a1c5-4e21-b67e-5f86130865f1\") " pod="kuadrant-system/authorino-674b59b84c-g27gs"
Apr 22 19:34:57.313236 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.313161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbrt\" (UniqueName: \"kubernetes.io/projected/3951a598-b1da-425f-98b3-962a2340a58c-kube-api-access-qvbrt\") pod \"authorino-79cbc94b89-cbpg6\" (UID: \"3951a598-b1da-425f-98b3-962a2340a58c\") " pod="kuadrant-system/authorino-79cbc94b89-cbpg6"
Apr 22 19:34:57.313236 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.313222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqbtg\" (UniqueName: \"kubernetes.io/projected/9abb94d5-a1c5-4e21-b67e-5f86130865f1-kube-api-access-lqbtg\") pod \"authorino-674b59b84c-g27gs\" (UID: \"9abb94d5-a1c5-4e21-b67e-5f86130865f1\") " pod="kuadrant-system/authorino-674b59b84c-g27gs"
Apr 22 19:34:57.321211 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.321192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqbtg\" (UniqueName: \"kubernetes.io/projected/9abb94d5-a1c5-4e21-b67e-5f86130865f1-kube-api-access-lqbtg\") pod \"authorino-674b59b84c-g27gs\" (UID: \"9abb94d5-a1c5-4e21-b67e-5f86130865f1\") " pod="kuadrant-system/authorino-674b59b84c-g27gs"
Apr 22 19:34:57.362129 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.362106 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-g27gs"
Apr 22 19:34:57.414187 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.414147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbrt\" (UniqueName: \"kubernetes.io/projected/3951a598-b1da-425f-98b3-962a2340a58c-kube-api-access-qvbrt\") pod \"authorino-79cbc94b89-cbpg6\" (UID: \"3951a598-b1da-425f-98b3-962a2340a58c\") " pod="kuadrant-system/authorino-79cbc94b89-cbpg6"
Apr 22 19:34:57.423224 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.423179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbrt\" (UniqueName: \"kubernetes.io/projected/3951a598-b1da-425f-98b3-962a2340a58c-kube-api-access-qvbrt\") pod \"authorino-79cbc94b89-cbpg6\" (UID: \"3951a598-b1da-425f-98b3-962a2340a58c\") " pod="kuadrant-system/authorino-79cbc94b89-cbpg6"
Apr 22 19:34:57.468129 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.468099 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-cbpg6"
Apr 22 19:34:57.596073 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.596047 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-cbpg6"]
Apr 22 19:34:57.598807 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:34:57.598777 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3951a598_b1da_425f_98b3_962a2340a58c.slice/crio-e8c2f1d59f81c17d8ca0cfa302821b0d340d1ff9a0fe61b32724612ad1c4925c WatchSource:0}: Error finding container e8c2f1d59f81c17d8ca0cfa302821b0d340d1ff9a0fe61b32724612ad1c4925c: Status 404 returned error can't find the container with id e8c2f1d59f81c17d8ca0cfa302821b0d340d1ff9a0fe61b32724612ad1c4925c
Apr 22 19:34:57.600062 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.600047 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:34:57.690259 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.690222 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g27gs"]
Apr 22 19:34:57.694177 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:34:57.694152 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9abb94d5_a1c5_4e21_b67e_5f86130865f1.slice/crio-11ab6eae9102c0f52c4822607f858b2472b86469a923fb0dc38cc3998fffe592 WatchSource:0}: Error finding container 11ab6eae9102c0f52c4822607f858b2472b86469a923fb0dc38cc3998fffe592: Status 404 returned error can't find the container with id 11ab6eae9102c0f52c4822607f858b2472b86469a923fb0dc38cc3998fffe592
Apr 22 19:34:57.826958 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.826923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-cbpg6" event={"ID":"3951a598-b1da-425f-98b3-962a2340a58c","Type":"ContainerStarted","Data":"e8c2f1d59f81c17d8ca0cfa302821b0d340d1ff9a0fe61b32724612ad1c4925c"}
Apr 22 19:34:57.827956 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:34:57.827932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-g27gs" event={"ID":"9abb94d5-a1c5-4e21-b67e-5f86130865f1","Type":"ContainerStarted","Data":"11ab6eae9102c0f52c4822607f858b2472b86469a923fb0dc38cc3998fffe592"}
Apr 22 19:35:00.840601 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:00.840558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-cbpg6" event={"ID":"3951a598-b1da-425f-98b3-962a2340a58c","Type":"ContainerStarted","Data":"63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9"}
Apr 22 19:35:00.841745 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:00.841723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-g27gs" event={"ID":"9abb94d5-a1c5-4e21-b67e-5f86130865f1","Type":"ContainerStarted","Data":"372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c"}
Apr 22 19:35:00.860937 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:00.860894 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-cbpg6" podStartSLOduration=1.187987599 podStartE2EDuration="3.860883447s" podCreationTimestamp="2026-04-22 19:34:57 +0000 UTC" firstStartedPulling="2026-04-22 19:34:57.600169566 +0000 UTC m=+647.385159416" lastFinishedPulling="2026-04-22 19:35:00.273065411 +0000 UTC m=+650.058055264" observedRunningTime="2026-04-22 19:35:00.859092552 +0000 UTC m=+650.644082424" watchObservedRunningTime="2026-04-22 19:35:00.860883447 +0000 UTC m=+650.645873318"
Apr 22 19:35:00.876564 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:00.876494 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-g27gs" podStartSLOduration=1.306098429 podStartE2EDuration="3.876485229s" podCreationTimestamp="2026-04-22 19:34:57 +0000 UTC" firstStartedPulling="2026-04-22 19:34:57.695665041 +0000 UTC m=+647.480654892" lastFinishedPulling="2026-04-22 19:35:00.266051841 +0000 UTC m=+650.051041692" observedRunningTime="2026-04-22 19:35:00.875147831 +0000 UTC m=+650.660137703" watchObservedRunningTime="2026-04-22 19:35:00.876485229 +0000 UTC m=+650.661475102"
Apr 22 19:35:00.902834 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:00.902804 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g27gs"]
Apr 22 19:35:02.848309 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:02.848268 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-g27gs" podUID="9abb94d5-a1c5-4e21-b67e-5f86130865f1" containerName="authorino" containerID="cri-o://372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c" gracePeriod=30
Apr 22 19:35:03.086858 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.086835 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-g27gs"
Apr 22 19:35:03.266312 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.266288 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqbtg\" (UniqueName: \"kubernetes.io/projected/9abb94d5-a1c5-4e21-b67e-5f86130865f1-kube-api-access-lqbtg\") pod \"9abb94d5-a1c5-4e21-b67e-5f86130865f1\" (UID: \"9abb94d5-a1c5-4e21-b67e-5f86130865f1\") "
Apr 22 19:35:03.268472 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.268447 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9abb94d5-a1c5-4e21-b67e-5f86130865f1-kube-api-access-lqbtg" (OuterVolumeSpecName: "kube-api-access-lqbtg") pod "9abb94d5-a1c5-4e21-b67e-5f86130865f1" (UID: "9abb94d5-a1c5-4e21-b67e-5f86130865f1"). InnerVolumeSpecName "kube-api-access-lqbtg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:35:03.367815 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.367774 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lqbtg\" (UniqueName: \"kubernetes.io/projected/9abb94d5-a1c5-4e21-b67e-5f86130865f1-kube-api-access-lqbtg\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:35:03.852960 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.852924 2576 generic.go:358] "Generic (PLEG): container finished" podID="9abb94d5-a1c5-4e21-b67e-5f86130865f1" containerID="372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c" exitCode=0
Apr 22 19:35:03.853431 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.852977 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-g27gs"
Apr 22 19:35:03.853431 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.852999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-g27gs" event={"ID":"9abb94d5-a1c5-4e21-b67e-5f86130865f1","Type":"ContainerDied","Data":"372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c"}
Apr 22 19:35:03.853431 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.853024 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-g27gs" event={"ID":"9abb94d5-a1c5-4e21-b67e-5f86130865f1","Type":"ContainerDied","Data":"11ab6eae9102c0f52c4822607f858b2472b86469a923fb0dc38cc3998fffe592"}
Apr 22 19:35:03.853431 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.853040 2576 scope.go:117] "RemoveContainer" containerID="372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c"
Apr 22 19:35:03.861773 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.861748 2576 scope.go:117] "RemoveContainer" containerID="372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c"
Apr 22 19:35:03.862014 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:35:03.861990 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c\": container with ID starting with 372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c not found: ID does not exist" containerID="372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c"
Apr 22 19:35:03.862090 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.862021 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c"} err="failed to get container status \"372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c\": rpc error: code = NotFound desc = could not find container \"372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c\": container with ID starting with 372ff9237b8fb8ae2630f9c2b612efb15ebeed41ea28b63dae46d5492f39f53c not found: ID does not exist"
Apr 22 19:35:03.874045 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.874022 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g27gs"]
Apr 22 19:35:03.877301 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:03.877283 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g27gs"]
Apr 22 19:35:04.754896 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:04.754867 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9abb94d5-a1c5-4e21-b67e-5f86130865f1" path="/var/lib/kubelet/pods/9abb94d5-a1c5-4e21-b67e-5f86130865f1/volumes"
Apr 22 19:35:20.981088 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:20.981058 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-7sxnb"]
Apr 22 19:35:20.981574 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:20.981382 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9abb94d5-a1c5-4e21-b67e-5f86130865f1" containerName="authorino"
Apr 22 19:35:20.981574 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:20.981393 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abb94d5-a1c5-4e21-b67e-5f86130865f1" containerName="authorino"
Apr 22 19:35:20.981574 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:20.981468 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9abb94d5-a1c5-4e21-b67e-5f86130865f1" containerName="authorino"
Apr 22 19:35:20.988994 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:20.988973 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-7sxnb"
Apr 22 19:35:20.991670 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:20.991646 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 22 19:35:20.994049 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:20.994028 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-7sxnb"]
Apr 22 19:35:21.115837 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.115804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdzjv\" (UniqueName: \"kubernetes.io/projected/30840fff-33dc-4c20-8ad1-42a64abe6b20-kube-api-access-rdzjv\") pod \"authorino-68bd676465-7sxnb\" (UID: \"30840fff-33dc-4c20-8ad1-42a64abe6b20\") " pod="kuadrant-system/authorino-68bd676465-7sxnb"
Apr 22 19:35:21.115996 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.115845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/30840fff-33dc-4c20-8ad1-42a64abe6b20-tls-cert\") pod \"authorino-68bd676465-7sxnb\" (UID: \"30840fff-33dc-4c20-8ad1-42a64abe6b20\") " pod="kuadrant-system/authorino-68bd676465-7sxnb"
Apr 22 19:35:21.217152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.217105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdzjv\" (UniqueName: \"kubernetes.io/projected/30840fff-33dc-4c20-8ad1-42a64abe6b20-kube-api-access-rdzjv\") pod \"authorino-68bd676465-7sxnb\" (UID: \"30840fff-33dc-4c20-8ad1-42a64abe6b20\") " pod="kuadrant-system/authorino-68bd676465-7sxnb"
Apr 22 19:35:21.217326 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.217166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName:
\"kubernetes.io/secret/30840fff-33dc-4c20-8ad1-42a64abe6b20-tls-cert\") pod \"authorino-68bd676465-7sxnb\" (UID: \"30840fff-33dc-4c20-8ad1-42a64abe6b20\") " pod="kuadrant-system/authorino-68bd676465-7sxnb" Apr 22 19:35:21.219723 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.219702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/30840fff-33dc-4c20-8ad1-42a64abe6b20-tls-cert\") pod \"authorino-68bd676465-7sxnb\" (UID: \"30840fff-33dc-4c20-8ad1-42a64abe6b20\") " pod="kuadrant-system/authorino-68bd676465-7sxnb" Apr 22 19:35:21.224259 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.224234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdzjv\" (UniqueName: \"kubernetes.io/projected/30840fff-33dc-4c20-8ad1-42a64abe6b20-kube-api-access-rdzjv\") pod \"authorino-68bd676465-7sxnb\" (UID: \"30840fff-33dc-4c20-8ad1-42a64abe6b20\") " pod="kuadrant-system/authorino-68bd676465-7sxnb" Apr 22 19:35:21.299079 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.299045 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-7sxnb" Apr 22 19:35:21.422171 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.422052 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-7sxnb"] Apr 22 19:35:21.424788 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:35:21.424757 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30840fff_33dc_4c20_8ad1_42a64abe6b20.slice/crio-005c765febfba94da77d08cabb5021b976ef28d9c268509707981be1da6b2d32 WatchSource:0}: Error finding container 005c765febfba94da77d08cabb5021b976ef28d9c268509707981be1da6b2d32: Status 404 returned error can't find the container with id 005c765febfba94da77d08cabb5021b976ef28d9c268509707981be1da6b2d32 Apr 22 19:35:21.915407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.915367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-7sxnb" event={"ID":"30840fff-33dc-4c20-8ad1-42a64abe6b20","Type":"ContainerStarted","Data":"04631d80997caf44c117996515b6bbb70eb8f522a82ceabca51fa4359eb31c7e"} Apr 22 19:35:21.915535 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.915414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-7sxnb" event={"ID":"30840fff-33dc-4c20-8ad1-42a64abe6b20","Type":"ContainerStarted","Data":"005c765febfba94da77d08cabb5021b976ef28d9c268509707981be1da6b2d32"} Apr 22 19:35:21.931232 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.931184 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-7sxnb" podStartSLOduration=1.521238006 podStartE2EDuration="1.931169518s" podCreationTimestamp="2026-04-22 19:35:20 +0000 UTC" firstStartedPulling="2026-04-22 19:35:21.426072621 +0000 UTC m=+671.211062472" lastFinishedPulling="2026-04-22 19:35:21.836004132 +0000 UTC m=+671.620993984" 
observedRunningTime="2026-04-22 19:35:21.93025558 +0000 UTC m=+671.715245453" watchObservedRunningTime="2026-04-22 19:35:21.931169518 +0000 UTC m=+671.716159390" Apr 22 19:35:21.963026 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.962995 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-cbpg6"] Apr 22 19:35:21.963622 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:21.963413 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-cbpg6" podUID="3951a598-b1da-425f-98b3-962a2340a58c" containerName="authorino" containerID="cri-o://63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9" gracePeriod=30 Apr 22 19:35:22.192494 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.192461 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-cbpg6" Apr 22 19:35:22.225911 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.225866 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvbrt\" (UniqueName: \"kubernetes.io/projected/3951a598-b1da-425f-98b3-962a2340a58c-kube-api-access-qvbrt\") pod \"3951a598-b1da-425f-98b3-962a2340a58c\" (UID: \"3951a598-b1da-425f-98b3-962a2340a58c\") " Apr 22 19:35:22.227954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.227932 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3951a598-b1da-425f-98b3-962a2340a58c-kube-api-access-qvbrt" (OuterVolumeSpecName: "kube-api-access-qvbrt") pod "3951a598-b1da-425f-98b3-962a2340a58c" (UID: "3951a598-b1da-425f-98b3-962a2340a58c"). InnerVolumeSpecName "kube-api-access-qvbrt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:35:22.327239 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.327206 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvbrt\" (UniqueName: \"kubernetes.io/projected/3951a598-b1da-425f-98b3-962a2340a58c-kube-api-access-qvbrt\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:35:22.919930 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.919893 2576 generic.go:358] "Generic (PLEG): container finished" podID="3951a598-b1da-425f-98b3-962a2340a58c" containerID="63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9" exitCode=0 Apr 22 19:35:22.920102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.919954 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-cbpg6" Apr 22 19:35:22.920102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.919977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-cbpg6" event={"ID":"3951a598-b1da-425f-98b3-962a2340a58c","Type":"ContainerDied","Data":"63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9"} Apr 22 19:35:22.920102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.920013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-cbpg6" event={"ID":"3951a598-b1da-425f-98b3-962a2340a58c","Type":"ContainerDied","Data":"e8c2f1d59f81c17d8ca0cfa302821b0d340d1ff9a0fe61b32724612ad1c4925c"} Apr 22 19:35:22.920102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.920030 2576 scope.go:117] "RemoveContainer" containerID="63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9" Apr 22 19:35:22.927768 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.927741 2576 scope.go:117] "RemoveContainer" containerID="63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9" Apr 22 19:35:22.928021 ip-10-0-132-160 kubenswrapper[2576]: 
E0422 19:35:22.928002 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9\": container with ID starting with 63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9 not found: ID does not exist" containerID="63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9" Apr 22 19:35:22.928080 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.928030 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9"} err="failed to get container status \"63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9\": rpc error: code = NotFound desc = could not find container \"63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9\": container with ID starting with 63e838d8e77c4542d0ec490f8e2151aa89879caf6cfb7caf7ff1c73d3f167cd9 not found: ID does not exist" Apr 22 19:35:22.950374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.950339 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-cbpg6"] Apr 22 19:35:22.953972 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:22.953948 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-cbpg6"] Apr 22 19:35:24.754697 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:35:24.754663 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3951a598-b1da-425f-98b3-962a2340a58c" path="/var/lib/kubelet/pods/3951a598-b1da-425f-98b3-962a2340a58c/volumes" Apr 22 19:37:38.012865 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.012831 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5"] Apr 22 19:37:38.013385 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.013365 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="3951a598-b1da-425f-98b3-962a2340a58c" containerName="authorino" Apr 22 19:37:38.013428 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.013391 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3951a598-b1da-425f-98b3-962a2340a58c" containerName="authorino" Apr 22 19:37:38.013544 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.013530 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3951a598-b1da-425f-98b3-962a2340a58c" containerName="authorino" Apr 22 19:37:38.016960 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.016933 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.020544 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.020495 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 19:37:38.020677 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.020495 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 19:37:38.020677 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.020640 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:37:38.020800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.020603 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-k6ghc\"" Apr 22 19:37:38.026760 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.026741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5"] Apr 22 19:37:38.104315 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.104275 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-dshm\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.104532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.104340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-home\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.104532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.104363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.104532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.104401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8zmj\" (UniqueName: \"kubernetes.io/projected/aba542da-4d81-471c-8366-0ea17f8add0d-kube-api-access-q8zmj\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.104532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.104419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/aba542da-4d81-471c-8366-0ea17f8add0d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.104532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.104487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.205074 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205030 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.205264 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-dshm\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.205264 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-home\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: 
\"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.205264 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.205264 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8zmj\" (UniqueName: \"kubernetes.io/projected/aba542da-4d81-471c-8366-0ea17f8add0d-kube-api-access-q8zmj\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.205264 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aba542da-4d81-471c-8366-0ea17f8add0d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.205560 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-model-cache\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 
19:37:38.205560 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205479 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-home\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.205703 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.205683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.207389 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.207368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-dshm\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.207749 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.207730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aba542da-4d81-471c-8366-0ea17f8add0d-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.213144 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.213123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8zmj\" (UniqueName: 
\"kubernetes.io/projected/aba542da-4d81-471c-8366-0ea17f8add0d-kube-api-access-q8zmj\") pod \"scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.330206 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.330169 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:38.470375 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:38.470350 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5"] Apr 22 19:37:38.472493 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:37:38.472461 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaba542da_4d81_471c_8366_0ea17f8add0d.slice/crio-d9088608f2c3a2a4bd2863b7fdf4cfe0a735b8c79220974ac76ee36b1f9bb1ee WatchSource:0}: Error finding container d9088608f2c3a2a4bd2863b7fdf4cfe0a735b8c79220974ac76ee36b1f9bb1ee: Status 404 returned error can't find the container with id d9088608f2c3a2a4bd2863b7fdf4cfe0a735b8c79220974ac76ee36b1f9bb1ee Apr 22 19:37:39.392863 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:39.392819 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" event={"ID":"aba542da-4d81-471c-8366-0ea17f8add0d","Type":"ContainerStarted","Data":"d9088608f2c3a2a4bd2863b7fdf4cfe0a735b8c79220974ac76ee36b1f9bb1ee"} Apr 22 19:37:42.406732 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:42.406688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" 
event={"ID":"aba542da-4d81-471c-8366-0ea17f8add0d","Type":"ContainerStarted","Data":"0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da"} Apr 22 19:37:47.426823 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:47.426785 2576 generic.go:358] "Generic (PLEG): container finished" podID="aba542da-4d81-471c-8366-0ea17f8add0d" containerID="0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da" exitCode=0 Apr 22 19:37:47.427204 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:47.426863 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" event={"ID":"aba542da-4d81-471c-8366-0ea17f8add0d","Type":"ContainerDied","Data":"0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da"} Apr 22 19:37:49.435345 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:49.435313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" event={"ID":"aba542da-4d81-471c-8366-0ea17f8add0d","Type":"ContainerStarted","Data":"12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05"} Apr 22 19:37:49.456118 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:49.456074 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" podStartSLOduration=2.480941391 podStartE2EDuration="12.456059814s" podCreationTimestamp="2026-04-22 19:37:37 +0000 UTC" firstStartedPulling="2026-04-22 19:37:38.474769256 +0000 UTC m=+808.259759105" lastFinishedPulling="2026-04-22 19:37:48.449887676 +0000 UTC m=+818.234877528" observedRunningTime="2026-04-22 19:37:49.453640437 +0000 UTC m=+819.238630309" watchObservedRunningTime="2026-04-22 19:37:49.456059814 +0000 UTC m=+819.241049686" Apr 22 19:37:52.016399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.016366 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd"] Apr 22 19:37:52.020327 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.020299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.022807 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.022782 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\"" Apr 22 19:37:52.029779 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.029757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd"] Apr 22 19:37:52.136617 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.136580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.136792 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.136654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.136792 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.136676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/5c13715c-25b0-4867-899d-df953b8f6671-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.136792 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.136692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.136792 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.136709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85gbf\" (UniqueName: \"kubernetes.io/projected/5c13715c-25b0-4867-899d-df953b8f6671-kube-api-access-85gbf\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.136792 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.136737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.237808 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.237771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.237969 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.237835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.237969 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.237924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.237969 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.237951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c13715c-25b0-4867-899d-df953b8f6671-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.238122 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.237977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.238122 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.238007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85gbf\" (UniqueName: \"kubernetes.io/projected/5c13715c-25b0-4867-899d-df953b8f6671-kube-api-access-85gbf\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.238251 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.238227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.238312 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.238283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.238369 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.238334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.240076 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.240057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.240434 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.240413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c13715c-25b0-4867-899d-df953b8f6671-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.245683 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.245664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85gbf\" (UniqueName: \"kubernetes.io/projected/5c13715c-25b0-4867-899d-df953b8f6671-kube-api-access-85gbf\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.330802 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.330713 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:37:52.461833 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:52.461761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd"] Apr 22 19:37:52.464675 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:37:52.464646 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c13715c_25b0_4867_899d_df953b8f6671.slice/crio-fd6f6a6da835d35490f5351cb385f173088281fba93c4e1171e7207efda53afc WatchSource:0}: Error finding container fd6f6a6da835d35490f5351cb385f173088281fba93c4e1171e7207efda53afc: Status 404 returned error can't find the container with id fd6f6a6da835d35490f5351cb385f173088281fba93c4e1171e7207efda53afc Apr 22 19:37:53.450602 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:53.450559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" event={"ID":"5c13715c-25b0-4867-899d-df953b8f6671","Type":"ContainerStarted","Data":"9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e"} Apr 22 19:37:53.450602 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:53.450599 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" event={"ID":"5c13715c-25b0-4867-899d-df953b8f6671","Type":"ContainerStarted","Data":"fd6f6a6da835d35490f5351cb385f173088281fba93c4e1171e7207efda53afc"} Apr 22 19:37:58.331088 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:58.331039 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:58.331541 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:58.331131 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:58.344089 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:58.344058 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:58.469949 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:58.469909 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c13715c-25b0-4867-899d-df953b8f6671" containerID="9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e" exitCode=0 Apr 22 19:37:58.470104 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:58.469983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" event={"ID":"5c13715c-25b0-4867-899d-df953b8f6671","Type":"ContainerDied","Data":"9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e"} Apr 22 19:37:58.482119 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:58.482100 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:37:59.475834 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:59.475801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" event={"ID":"5c13715c-25b0-4867-899d-df953b8f6671","Type":"ContainerStarted","Data":"2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d"} Apr 22 19:37:59.497593 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:37:59.497548 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" podStartSLOduration=8.497533225 podStartE2EDuration="8.497533225s" podCreationTimestamp="2026-04-22 19:37:51 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:37:59.495210091 +0000 UTC m=+829.280199954" watchObservedRunningTime="2026-04-22 19:37:59.497533225 +0000 UTC m=+829.282523096" Apr 22 19:38:02.331337 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:02.331299 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:38:02.331337 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:02.331347 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:38:02.344024 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:02.343998 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:38:02.503920 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:02.503890 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:38:30.254099 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.254068 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5"] Apr 22 19:38:30.254553 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.254364 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" podUID="aba542da-4d81-471c-8366-0ea17f8add0d" containerName="main" containerID="cri-o://12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05" gracePeriod=30 Apr 22 19:38:30.503900 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.503871 2576 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:38:30.571251 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.571217 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aba542da-4d81-471c-8366-0ea17f8add0d-tls-certs\") pod \"aba542da-4d81-471c-8366-0ea17f8add0d\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " Apr 22 19:38:30.571409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.571272 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8zmj\" (UniqueName: \"kubernetes.io/projected/aba542da-4d81-471c-8366-0ea17f8add0d-kube-api-access-q8zmj\") pod \"aba542da-4d81-471c-8366-0ea17f8add0d\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " Apr 22 19:38:30.571409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.571326 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-dshm\") pod \"aba542da-4d81-471c-8366-0ea17f8add0d\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " Apr 22 19:38:30.571409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.571377 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-home\") pod \"aba542da-4d81-471c-8366-0ea17f8add0d\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " Apr 22 19:38:30.571580 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.571417 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-kserve-provision-location\") pod \"aba542da-4d81-471c-8366-0ea17f8add0d\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " Apr 22 19:38:30.571580 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:38:30.571439 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-model-cache\") pod \"aba542da-4d81-471c-8366-0ea17f8add0d\" (UID: \"aba542da-4d81-471c-8366-0ea17f8add0d\") " Apr 22 19:38:30.571773 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.571744 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-home" (OuterVolumeSpecName: "home") pod "aba542da-4d81-471c-8366-0ea17f8add0d" (UID: "aba542da-4d81-471c-8366-0ea17f8add0d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:38:30.571892 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.571767 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-model-cache" (OuterVolumeSpecName: "model-cache") pod "aba542da-4d81-471c-8366-0ea17f8add0d" (UID: "aba542da-4d81-471c-8366-0ea17f8add0d"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:38:30.573463 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.573440 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aba542da-4d81-471c-8366-0ea17f8add0d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "aba542da-4d81-471c-8366-0ea17f8add0d" (UID: "aba542da-4d81-471c-8366-0ea17f8add0d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:38:30.573562 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.573548 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-dshm" (OuterVolumeSpecName: "dshm") pod "aba542da-4d81-471c-8366-0ea17f8add0d" (UID: "aba542da-4d81-471c-8366-0ea17f8add0d"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:38:30.573740 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.573726 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aba542da-4d81-471c-8366-0ea17f8add0d-kube-api-access-q8zmj" (OuterVolumeSpecName: "kube-api-access-q8zmj") pod "aba542da-4d81-471c-8366-0ea17f8add0d" (UID: "aba542da-4d81-471c-8366-0ea17f8add0d"). InnerVolumeSpecName "kube-api-access-q8zmj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:38:30.594731 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.594703 2576 generic.go:358] "Generic (PLEG): container finished" podID="aba542da-4d81-471c-8366-0ea17f8add0d" containerID="12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05" exitCode=0 Apr 22 19:38:30.594831 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.594787 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" Apr 22 19:38:30.594831 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.594791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" event={"ID":"aba542da-4d81-471c-8366-0ea17f8add0d","Type":"ContainerDied","Data":"12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05"} Apr 22 19:38:30.594831 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.594830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5" event={"ID":"aba542da-4d81-471c-8366-0ea17f8add0d","Type":"ContainerDied","Data":"d9088608f2c3a2a4bd2863b7fdf4cfe0a735b8c79220974ac76ee36b1f9bb1ee"} Apr 22 19:38:30.594967 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.594846 2576 scope.go:117] "RemoveContainer" 
containerID="12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05" Apr 22 19:38:30.603090 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.603063 2576 scope.go:117] "RemoveContainer" containerID="0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da" Apr 22 19:38:30.630333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.630305 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aba542da-4d81-471c-8366-0ea17f8add0d" (UID: "aba542da-4d81-471c-8366-0ea17f8add0d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:38:30.663471 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.663450 2576 scope.go:117] "RemoveContainer" containerID="12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05" Apr 22 19:38:30.663800 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:38:30.663778 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05\": container with ID starting with 12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05 not found: ID does not exist" containerID="12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05" Apr 22 19:38:30.663873 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.663813 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05"} err="failed to get container status \"12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05\": rpc error: code = NotFound desc = could not find container \"12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05\": container with ID starting with 
12bcc825da6014d330060ef4f3afc6f66f46354092a7d3fe3b66a50d65876a05 not found: ID does not exist" Apr 22 19:38:30.663873 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.663833 2576 scope.go:117] "RemoveContainer" containerID="0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da" Apr 22 19:38:30.664099 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:38:30.664080 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da\": container with ID starting with 0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da not found: ID does not exist" containerID="0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da" Apr 22 19:38:30.664149 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.664102 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da"} err="failed to get container status \"0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da\": rpc error: code = NotFound desc = could not find container \"0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da\": container with ID starting with 0a372a90fa4d7125765dfe6d9dd95309c36c1635e97905ea10097db123d064da not found: ID does not exist" Apr 22 19:38:30.672594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.672569 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:38:30.672594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.672592 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" 
DevicePath \"\"" Apr 22 19:38:30.672718 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.672604 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:38:30.672718 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.672614 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aba542da-4d81-471c-8366-0ea17f8add0d-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:38:30.672718 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.672624 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8zmj\" (UniqueName: \"kubernetes.io/projected/aba542da-4d81-471c-8366-0ea17f8add0d-kube-api-access-q8zmj\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:38:30.672718 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.672633 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/aba542da-4d81-471c-8366-0ea17f8add0d-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:38:30.911847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.911777 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5"] Apr 22 19:38:30.915828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:30.915805 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-fbdfdc549-h7gn5"] Apr 22 19:38:32.754834 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:32.754800 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba542da-4d81-471c-8366-0ea17f8add0d" path="/var/lib/kubelet/pods/aba542da-4d81-471c-8366-0ea17f8add0d/volumes" Apr 22 19:38:36.497359 ip-10-0-132-160 kubenswrapper[2576]: 
I0422 19:38:36.497316 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf"] Apr 22 19:38:36.497927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.497841 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba542da-4d81-471c-8366-0ea17f8add0d" containerName="storage-initializer" Apr 22 19:38:36.497927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.497864 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba542da-4d81-471c-8366-0ea17f8add0d" containerName="storage-initializer" Apr 22 19:38:36.497927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.497879 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aba542da-4d81-471c-8366-0ea17f8add0d" containerName="main" Apr 22 19:38:36.497927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.497890 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba542da-4d81-471c-8366-0ea17f8add0d" containerName="main" Apr 22 19:38:36.498144 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.497993 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="aba542da-4d81-471c-8366-0ea17f8add0d" containerName="main" Apr 22 19:38:36.503152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.503130 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.511706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.511679 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 19:38:36.512267 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.512244 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-nd9cm\"" Apr 22 19:38:36.513676 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.513656 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf"] Apr 22 19:38:36.522779 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.522752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jtqz\" (UniqueName: \"kubernetes.io/projected/52365ba6-2172-47c7-bd87-aa4d07b917be-kube-api-access-8jtqz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.522869 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.522812 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52365ba6-2172-47c7-bd87-aa4d07b917be-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.522869 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.522855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.522961 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.522888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.522961 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.522913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.522961 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.522944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.623944 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.623916 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8jtqz\" (UniqueName: \"kubernetes.io/projected/52365ba6-2172-47c7-bd87-aa4d07b917be-kube-api-access-8jtqz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624118 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.623975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52365ba6-2172-47c7-bd87-aa4d07b917be-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624118 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.624002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624118 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.624028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624118 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.624054 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624118 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.624094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624461 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.624434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624461 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.624452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624576 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.624495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.624576 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.624549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.626433 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.626411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52365ba6-2172-47c7-bd87-aa4d07b917be-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.631981 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.631960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jtqz\" (UniqueName: \"kubernetes.io/projected/52365ba6-2172-47c7-bd87-aa4d07b917be-kube-api-access-8jtqz\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:36.813370 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:36.813342 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:38:37.147335 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:37.147153 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf"] Apr 22 19:38:37.150313 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:38:37.150277 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52365ba6_2172_47c7_bd87_aa4d07b917be.slice/crio-482b8346aefb7dffdf85119a8133ef1d15c2f70cb6995ec4b0426b973cc64c80 WatchSource:0}: Error finding container 482b8346aefb7dffdf85119a8133ef1d15c2f70cb6995ec4b0426b973cc64c80: Status 404 returned error can't find the container with id 482b8346aefb7dffdf85119a8133ef1d15c2f70cb6995ec4b0426b973cc64c80 Apr 22 19:38:37.626418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:37.626379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" event={"ID":"52365ba6-2172-47c7-bd87-aa4d07b917be","Type":"ContainerStarted","Data":"87c019f09a24612fc151ea76d842977346c069af4720053833cd838632d2cd5a"} Apr 22 19:38:37.626418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:37.626419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" event={"ID":"52365ba6-2172-47c7-bd87-aa4d07b917be","Type":"ContainerStarted","Data":"482b8346aefb7dffdf85119a8133ef1d15c2f70cb6995ec4b0426b973cc64c80"} Apr 22 19:38:38.631030 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:38.630991 2576 generic.go:358] "Generic (PLEG): container finished" podID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerID="87c019f09a24612fc151ea76d842977346c069af4720053833cd838632d2cd5a" exitCode=0 Apr 22 19:38:38.631395 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:38:38.631077 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" event={"ID":"52365ba6-2172-47c7-bd87-aa4d07b917be","Type":"ContainerDied","Data":"87c019f09a24612fc151ea76d842977346c069af4720053833cd838632d2cd5a"} Apr 22 19:38:40.641804 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:38:40.641694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" event={"ID":"52365ba6-2172-47c7-bd87-aa4d07b917be","Type":"ContainerStarted","Data":"3c9a8d9af275288f8176619a4fa5d3979937c2cdc8fa7a6bc3be6080f95ed8fa"} Apr 22 19:39:09.108541 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:09.108468 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf"] Apr 22 19:39:10.697806 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.697781 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log" Apr 22 19:39:10.698713 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.698691 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log" Apr 22 19:39:10.701563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.701542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log" Apr 22 19:39:10.702350 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.702333 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log" Apr 22 19:39:10.761408 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:39:10.761364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" event={"ID":"52365ba6-2172-47c7-bd87-aa4d07b917be","Type":"ContainerStarted","Data":"e3aeba64d71aabff563ea9a92bd8b1ed92126e73a939ffbc649597177b995f0c"} Apr 22 19:39:10.761599 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.761530 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="main" containerID="cri-o://3c9a8d9af275288f8176619a4fa5d3979937c2cdc8fa7a6bc3be6080f95ed8fa" gracePeriod=30 Apr 22 19:39:10.761599 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.761553 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="tokenizer" containerID="cri-o://e3aeba64d71aabff563ea9a92bd8b1ed92126e73a939ffbc649597177b995f0c" gracePeriod=30 Apr 22 19:39:10.761729 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.761628 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:39:10.764559 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.764480 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 22 19:39:10.785720 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:10.785661 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" podStartSLOduration=3.607931052 podStartE2EDuration="34.785645505s" podCreationTimestamp="2026-04-22 19:38:36 +0000 UTC" firstStartedPulling="2026-04-22 19:38:38.632179977 +0000 UTC m=+868.417169827" lastFinishedPulling="2026-04-22 19:39:09.809894418 +0000 UTC m=+899.594884280" observedRunningTime="2026-04-22 19:39:10.783329898 +0000 UTC m=+900.568319810" watchObservedRunningTime="2026-04-22 19:39:10.785645505 +0000 UTC m=+900.570635378" Apr 22 19:39:11.766763 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:11.766721 2576 generic.go:358] "Generic (PLEG): container finished" podID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerID="3c9a8d9af275288f8176619a4fa5d3979937c2cdc8fa7a6bc3be6080f95ed8fa" exitCode=0 Apr 22 19:39:11.767124 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:11.766792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" event={"ID":"52365ba6-2172-47c7-bd87-aa4d07b917be","Type":"ContainerDied","Data":"3c9a8d9af275288f8176619a4fa5d3979937c2cdc8fa7a6bc3be6080f95ed8fa"} Apr 22 19:39:16.813887 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:16.813853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:39:18.633217 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.633185 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2"] Apr 22 19:39:18.639734 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.639701 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.642839 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.642812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 19:39:18.646086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.646061 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2"] Apr 22 19:39:18.724292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.724245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-model-cache\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.724292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.724293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-dshm\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.724535 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.724350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-home\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.724535 ip-10-0-132-160 kubenswrapper[2576]: 
I0422 19:39:18.724418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88755724-cace-4c21-87c0-afb6e0e4eb6f-tls-certs\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.724535 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.724448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.724535 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.724487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk7p6\" (UniqueName: \"kubernetes.io/projected/88755724-cace-4c21-87c0-afb6e0e4eb6f-kube-api-access-sk7p6\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825024 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.824986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-model-cache\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825024 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.825029 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-dshm\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825281 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.825054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-home\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825281 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.825107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88755724-cace-4c21-87c0-afb6e0e4eb6f-tls-certs\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825281 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.825138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825281 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.825239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sk7p6\" (UniqueName: \"kubernetes.io/projected/88755724-cace-4c21-87c0-afb6e0e4eb6f-kube-api-access-sk7p6\") pod 
\"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825526 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.825437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-model-cache\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825596 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.825543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-home\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.825697 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.825612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.827359 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.827336 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-dshm\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.827628 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:39:18.827609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88755724-cace-4c21-87c0-afb6e0e4eb6f-tls-certs\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.834301 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.834279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk7p6\" (UniqueName: \"kubernetes.io/projected/88755724-cace-4c21-87c0-afb6e0e4eb6f-kube-api-access-sk7p6\") pod \"precise-prefix-cache-test-kserve-6b7d88c649-7vvf2\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:18.952147 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:18.952065 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:19.078341 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:19.078318 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2"] Apr 22 19:39:19.080786 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:39:19.080756 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88755724_cace_4c21_87c0_afb6e0e4eb6f.slice/crio-59b994bddec825069141df09294e8625ea417a1c85b645ef5d22c9520634d3fe WatchSource:0}: Error finding container 59b994bddec825069141df09294e8625ea417a1c85b645ef5d22c9520634d3fe: Status 404 returned error can't find the container with id 59b994bddec825069141df09294e8625ea417a1c85b645ef5d22c9520634d3fe Apr 22 19:39:19.796084 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:19.796045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" event={"ID":"88755724-cace-4c21-87c0-afb6e0e4eb6f","Type":"ContainerStarted","Data":"3195a31f4fe2330dfc6b16b5568af2655ebd4d8b0ce9204542829773d149f5eb"} Apr 22 19:39:19.796084 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:19.796086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" event={"ID":"88755724-cace-4c21-87c0-afb6e0e4eb6f","Type":"ContainerStarted","Data":"59b994bddec825069141df09294e8625ea417a1c85b645ef5d22c9520634d3fe"} Apr 22 19:39:20.399826 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.399794 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd"] Apr 22 19:39:20.400124 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.400073 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" podUID="5c13715c-25b0-4867-899d-df953b8f6671" containerName="main" containerID="cri-o://2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d" gracePeriod=30 Apr 22 19:39:20.667050 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.667019 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:39:20.742445 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.742412 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-home\") pod \"5c13715c-25b0-4867-899d-df953b8f6671\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " Apr 22 19:39:20.742655 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.742472 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-model-cache\") pod \"5c13715c-25b0-4867-899d-df953b8f6671\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " Apr 22 19:39:20.742655 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.742526 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85gbf\" (UniqueName: \"kubernetes.io/projected/5c13715c-25b0-4867-899d-df953b8f6671-kube-api-access-85gbf\") pod \"5c13715c-25b0-4867-899d-df953b8f6671\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " Apr 22 19:39:20.742655 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.742589 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-dshm\") pod \"5c13715c-25b0-4867-899d-df953b8f6671\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " Apr 22 19:39:20.742816 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.742658 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c13715c-25b0-4867-899d-df953b8f6671-tls-certs\") pod \"5c13715c-25b0-4867-899d-df953b8f6671\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " Apr 22 19:39:20.742816 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:39:20.742683 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-kserve-provision-location\") pod \"5c13715c-25b0-4867-899d-df953b8f6671\" (UID: \"5c13715c-25b0-4867-899d-df953b8f6671\") " Apr 22 19:39:20.742816 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.742736 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-home" (OuterVolumeSpecName: "home") pod "5c13715c-25b0-4867-899d-df953b8f6671" (UID: "5c13715c-25b0-4867-899d-df953b8f6671"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:20.742816 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.742751 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-model-cache" (OuterVolumeSpecName: "model-cache") pod "5c13715c-25b0-4867-899d-df953b8f6671" (UID: "5c13715c-25b0-4867-899d-df953b8f6671"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:20.743000 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.742981 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:20.743053 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.743000 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:20.744946 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.744919 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c13715c-25b0-4867-899d-df953b8f6671-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5c13715c-25b0-4867-899d-df953b8f6671" (UID: "5c13715c-25b0-4867-899d-df953b8f6671"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:20.745065 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.744948 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c13715c-25b0-4867-899d-df953b8f6671-kube-api-access-85gbf" (OuterVolumeSpecName: "kube-api-access-85gbf") pod "5c13715c-25b0-4867-899d-df953b8f6671" (UID: "5c13715c-25b0-4867-899d-df953b8f6671"). InnerVolumeSpecName "kube-api-access-85gbf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:39:20.745360 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.745339 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-dshm" (OuterVolumeSpecName: "dshm") pod "5c13715c-25b0-4867-899d-df953b8f6671" (UID: "5c13715c-25b0-4867-899d-df953b8f6671"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:20.762862 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:39:20.762841 2576 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.39:9003", ServerName: "10.134.0.39:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.39:9003: connect: connection refused" Apr 22 19:39:20.799328 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.799285 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5c13715c-25b0-4867-899d-df953b8f6671" (UID: "5c13715c-25b0-4867-899d-df953b8f6671"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:20.801408 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.801378 2576 generic.go:358] "Generic (PLEG): container finished" podID="5c13715c-25b0-4867-899d-df953b8f6671" containerID="2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d" exitCode=0 Apr 22 19:39:20.801558 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.801452 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" Apr 22 19:39:20.801558 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.801469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" event={"ID":"5c13715c-25b0-4867-899d-df953b8f6671","Type":"ContainerDied","Data":"2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d"} Apr 22 19:39:20.801558 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.801537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd" event={"ID":"5c13715c-25b0-4867-899d-df953b8f6671","Type":"ContainerDied","Data":"fd6f6a6da835d35490f5351cb385f173088281fba93c4e1171e7207efda53afc"} Apr 22 19:39:20.801671 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.801559 2576 scope.go:117] "RemoveContainer" containerID="2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d" Apr 22 19:39:20.810583 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.810303 2576 scope.go:117] "RemoveContainer" containerID="9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e" Apr 22 19:39:20.824001 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.823975 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd"] Apr 22 19:39:20.826045 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.826019 2576 scope.go:117] "RemoveContainer" containerID="2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d" Apr 22 19:39:20.826375 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:39:20.826346 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d\": container with ID starting with 
2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d not found: ID does not exist" containerID="2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d" Apr 22 19:39:20.826465 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.826383 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d"} err="failed to get container status \"2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d\": rpc error: code = NotFound desc = could not find container \"2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d\": container with ID starting with 2a2077d7efa35cb8a7a36f760aec05d501c967b00c9d82a8baa888195f9d7c9d not found: ID does not exist" Apr 22 19:39:20.826465 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.826407 2576 scope.go:117] "RemoveContainer" containerID="9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e" Apr 22 19:39:20.826724 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:39:20.826698 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e\": container with ID starting with 9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e not found: ID does not exist" containerID="9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e" Apr 22 19:39:20.826833 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.826732 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e"} err="failed to get container status \"9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e\": rpc error: code = NotFound desc = could not find container \"9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e\": container with ID starting with 
9d0388b5700a3e2e515ac329266b87c107225de7b8f76600883f96fab8ad2b7e not found: ID does not exist" Apr 22 19:39:20.827520 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.827483 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-86b77bd597r77bd"] Apr 22 19:39:20.844448 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.844423 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c13715c-25b0-4867-899d-df953b8f6671-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:20.844629 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.844452 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:20.844629 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.844475 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-85gbf\" (UniqueName: \"kubernetes.io/projected/5c13715c-25b0-4867-899d-df953b8f6671-kube-api-access-85gbf\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:20.844629 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:20.844491 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c13715c-25b0-4867-899d-df953b8f6671-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:21.762833 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:21.762772 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.39:9003\" within 1s: context deadline exceeded" Apr 22 
19:39:22.756840 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:22.756797 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c13715c-25b0-4867-899d-df953b8f6671" path="/var/lib/kubelet/pods/5c13715c-25b0-4867-899d-df953b8f6671/volumes" Apr 22 19:39:23.817203 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:23.817169 2576 generic.go:358] "Generic (PLEG): container finished" podID="88755724-cace-4c21-87c0-afb6e0e4eb6f" containerID="3195a31f4fe2330dfc6b16b5568af2655ebd4d8b0ce9204542829773d149f5eb" exitCode=0 Apr 22 19:39:23.817670 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:23.817235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" event={"ID":"88755724-cace-4c21-87c0-afb6e0e4eb6f","Type":"ContainerDied","Data":"3195a31f4fe2330dfc6b16b5568af2655ebd4d8b0ce9204542829773d149f5eb"} Apr 22 19:39:24.822354 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.822317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" event={"ID":"88755724-cace-4c21-87c0-afb6e0e4eb6f","Type":"ContainerStarted","Data":"5b9c6ed7983940570d6bf481a458a68e11e5ca89fa60d49550872566ea4e678a"} Apr 22 19:39:24.842661 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.842610 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" podStartSLOduration=6.842594945 podStartE2EDuration="6.842594945s" podCreationTimestamp="2026-04-22 19:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:39:24.841340547 +0000 UTC m=+914.626330445" watchObservedRunningTime="2026-04-22 19:39:24.842594945 +0000 UTC m=+914.627584817" Apr 22 19:39:24.918170 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.918137 2576 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf"] Apr 22 19:39:24.918525 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.918512 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c13715c-25b0-4867-899d-df953b8f6671" containerName="storage-initializer" Apr 22 19:39:24.918572 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.918527 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c13715c-25b0-4867-899d-df953b8f6671" containerName="storage-initializer" Apr 22 19:39:24.918572 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.918540 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c13715c-25b0-4867-899d-df953b8f6671" containerName="main" Apr 22 19:39:24.918572 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.918546 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c13715c-25b0-4867-899d-df953b8f6671" containerName="main" Apr 22 19:39:24.918667 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.918604 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c13715c-25b0-4867-899d-df953b8f6671" containerName="main" Apr 22 19:39:24.939952 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.939918 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf"] Apr 22 19:39:24.940098 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.939958 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:24.942378 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.942354 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 22 19:39:24.983647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.983610 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:24.983841 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.983673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcadc4a-e66b-43c5-8347-119fdb4d9578-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:24.983841 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.983707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:24.983841 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.983756 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:24.983841 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.983783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:24.983841 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:24.983816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frw85\" (UniqueName: \"kubernetes.io/projected/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kube-api-access-frw85\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085088 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085088 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085061 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcadc4a-e66b-43c5-8347-119fdb4d9578-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085298 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085298 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085298 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085298 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-frw85\" (UniqueName: \"kubernetes.io/projected/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kube-api-access-frw85\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085520 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085591 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.085645 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.085604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.087359 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.087334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.087653 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.087627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcadc4a-e66b-43c5-8347-119fdb4d9578-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.094092 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.094071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frw85\" (UniqueName: \"kubernetes.io/projected/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kube-api-access-frw85\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.249647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.249615 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:39:25.379870 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.379845 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf"] Apr 22 19:39:25.382141 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:39:25.382108 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebcadc4a_e66b_43c5_8347_119fdb4d9578.slice/crio-d111857c7ff3dbde68ac143e23cbbafe6b6aab0f81fec4b9d1a5022ae882b461 WatchSource:0}: Error finding container d111857c7ff3dbde68ac143e23cbbafe6b6aab0f81fec4b9d1a5022ae882b461: Status 404 returned error can't find the container with id d111857c7ff3dbde68ac143e23cbbafe6b6aab0f81fec4b9d1a5022ae882b461 Apr 22 19:39:25.827572 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.827533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" event={"ID":"ebcadc4a-e66b-43c5-8347-119fdb4d9578","Type":"ContainerStarted","Data":"8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184"} Apr 22 19:39:25.827572 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:25.827575 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" event={"ID":"ebcadc4a-e66b-43c5-8347-119fdb4d9578","Type":"ContainerStarted","Data":"d111857c7ff3dbde68ac143e23cbbafe6b6aab0f81fec4b9d1a5022ae882b461"} Apr 22 19:39:28.952301 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:28.952262 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:28.952848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:28.952448 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:28.965450 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:28.965426 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:29.847318 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:29.847232 2576 generic.go:358] "Generic (PLEG): container finished" podID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerID="8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184" exitCode=0 Apr 22 19:39:29.847318 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:29.847274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" event={"ID":"ebcadc4a-e66b-43c5-8347-119fdb4d9578","Type":"ContainerDied","Data":"8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184"} Apr 22 19:39:29.859160 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:29.859136 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:30.762516 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:39:30.762469 2576 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.39:9003", ServerName: "10.134.0.39:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.39:9003: connect: connection refused" Apr 22 19:39:31.763110 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:31.763065 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.39:9003\" within 1s: context deadline exceeded" Apr 22 19:39:40.762950 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:39:40.762921 2576 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.39:9003", ServerName: "10.134.0.39:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.39:9003: connect: connection refused" Apr 22 19:39:40.892655 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:40.892618 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf_52365ba6-2172-47c7-bd87-aa4d07b917be/tokenizer/0.log" Apr 22 19:39:40.893334 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:40.893303 2576 generic.go:358] "Generic (PLEG): container finished" podID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerID="e3aeba64d71aabff563ea9a92bd8b1ed92126e73a939ffbc649597177b995f0c" exitCode=137 Apr 22 19:39:40.893462 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:40.893376 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" event={"ID":"52365ba6-2172-47c7-bd87-aa4d07b917be","Type":"ContainerDied","Data":"e3aeba64d71aabff563ea9a92bd8b1ed92126e73a939ffbc649597177b995f0c"} Apr 22 19:39:41.491072 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.490988 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf_52365ba6-2172-47c7-bd87-aa4d07b917be/tokenizer/0.log" Apr 22 19:39:41.491994 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.491968 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:39:41.655914 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.655839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-uds\") pod \"52365ba6-2172-47c7-bd87-aa4d07b917be\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " Apr 22 19:39:41.655914 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.655897 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jtqz\" (UniqueName: \"kubernetes.io/projected/52365ba6-2172-47c7-bd87-aa4d07b917be-kube-api-access-8jtqz\") pod \"52365ba6-2172-47c7-bd87-aa4d07b917be\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " Apr 22 19:39:41.656136 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.655983 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-tmp\") pod \"52365ba6-2172-47c7-bd87-aa4d07b917be\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " Apr 22 19:39:41.656136 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.656039 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-cache\") pod \"52365ba6-2172-47c7-bd87-aa4d07b917be\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " Apr 22 19:39:41.656136 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.656070 
2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52365ba6-2172-47c7-bd87-aa4d07b917be-tls-certs\") pod \"52365ba6-2172-47c7-bd87-aa4d07b917be\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " Apr 22 19:39:41.656136 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.656112 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-kserve-provision-location\") pod \"52365ba6-2172-47c7-bd87-aa4d07b917be\" (UID: \"52365ba6-2172-47c7-bd87-aa4d07b917be\") " Apr 22 19:39:41.656343 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.656301 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "52365ba6-2172-47c7-bd87-aa4d07b917be" (UID: "52365ba6-2172-47c7-bd87-aa4d07b917be"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:41.656464 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.656439 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "52365ba6-2172-47c7-bd87-aa4d07b917be" (UID: "52365ba6-2172-47c7-bd87-aa4d07b917be"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:41.656549 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.656458 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-uds\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:41.656744 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.656716 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "52365ba6-2172-47c7-bd87-aa4d07b917be" (UID: "52365ba6-2172-47c7-bd87-aa4d07b917be"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:41.657113 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.657090 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "52365ba6-2172-47c7-bd87-aa4d07b917be" (UID: "52365ba6-2172-47c7-bd87-aa4d07b917be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:41.658578 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.658545 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52365ba6-2172-47c7-bd87-aa4d07b917be-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "52365ba6-2172-47c7-bd87-aa4d07b917be" (UID: "52365ba6-2172-47c7-bd87-aa4d07b917be"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:41.658703 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.658553 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52365ba6-2172-47c7-bd87-aa4d07b917be-kube-api-access-8jtqz" (OuterVolumeSpecName: "kube-api-access-8jtqz") pod "52365ba6-2172-47c7-bd87-aa4d07b917be" (UID: "52365ba6-2172-47c7-bd87-aa4d07b917be"). InnerVolumeSpecName "kube-api-access-8jtqz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:39:41.757397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.757365 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:41.757397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.757393 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52365ba6-2172-47c7-bd87-aa4d07b917be-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:41.757397 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.757404 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:41.757658 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.757413 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8jtqz\" (UniqueName: \"kubernetes.io/projected/52365ba6-2172-47c7-bd87-aa4d07b917be-kube-api-access-8jtqz\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:41.757658 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.757423 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/52365ba6-2172-47c7-bd87-aa4d07b917be-tokenizer-tmp\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:41.762674 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.762638 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.39:9003\" within 1s: context deadline exceeded" Apr 22 19:39:41.898120 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.898093 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf_52365ba6-2172-47c7-bd87-aa4d07b917be/tokenizer/0.log" Apr 22 19:39:41.898880 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.898863 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" Apr 22 19:39:41.898880 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.898868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf" event={"ID":"52365ba6-2172-47c7-bd87-aa4d07b917be","Type":"ContainerDied","Data":"482b8346aefb7dffdf85119a8133ef1d15c2f70cb6995ec4b0426b973cc64c80"} Apr 22 19:39:41.899034 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.898910 2576 scope.go:117] "RemoveContainer" containerID="e3aeba64d71aabff563ea9a92bd8b1ed92126e73a939ffbc649597177b995f0c" Apr 22 19:39:41.907656 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.907639 2576 scope.go:117] "RemoveContainer" containerID="3c9a8d9af275288f8176619a4fa5d3979937c2cdc8fa7a6bc3be6080f95ed8fa" Apr 22 19:39:41.915078 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.915061 2576 scope.go:117] "RemoveContainer" 
containerID="87c019f09a24612fc151ea76d842977346c069af4720053833cd838632d2cd5a" Apr 22 19:39:41.921495 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.921473 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf"] Apr 22 19:39:41.927387 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:41.927364 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-dc67bb88zmlf"] Apr 22 19:39:42.755806 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:42.755772 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" path="/var/lib/kubelet/pods/52365ba6-2172-47c7-bd87-aa4d07b917be/volumes" Apr 22 19:39:53.174361 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:53.174325 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2"] Apr 22 19:39:53.293100 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:53.174728 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" podUID="88755724-cace-4c21-87c0-afb6e0e4eb6f" containerName="main" containerID="cri-o://5b9c6ed7983940570d6bf481a458a68e11e5ca89fa60d49550872566ea4e678a" gracePeriod=30 Apr 22 19:39:53.947974 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:53.947824 2576 generic.go:358] "Generic (PLEG): container finished" podID="88755724-cace-4c21-87c0-afb6e0e4eb6f" containerID="5b9c6ed7983940570d6bf481a458a68e11e5ca89fa60d49550872566ea4e678a" exitCode=0 Apr 22 19:39:53.947974 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:53.947878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" 
event={"ID":"88755724-cace-4c21-87c0-afb6e0e4eb6f","Type":"ContainerDied","Data":"5b9c6ed7983940570d6bf481a458a68e11e5ca89fa60d49550872566ea4e678a"} Apr 22 19:39:53.972311 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:53.972287 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:54.082452 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082367 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-home\") pod \"88755724-cace-4c21-87c0-afb6e0e4eb6f\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " Apr 22 19:39:54.082452 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082411 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88755724-cace-4c21-87c0-afb6e0e4eb6f-tls-certs\") pod \"88755724-cace-4c21-87c0-afb6e0e4eb6f\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " Apr 22 19:39:54.082452 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082431 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-dshm\") pod \"88755724-cace-4c21-87c0-afb6e0e4eb6f\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " Apr 22 19:39:54.082758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082457 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-model-cache\") pod \"88755724-cace-4c21-87c0-afb6e0e4eb6f\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " Apr 22 19:39:54.082758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082476 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sk7p6\" (UniqueName: \"kubernetes.io/projected/88755724-cace-4c21-87c0-afb6e0e4eb6f-kube-api-access-sk7p6\") pod \"88755724-cace-4c21-87c0-afb6e0e4eb6f\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " Apr 22 19:39:54.082758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-kserve-provision-location\") pod \"88755724-cace-4c21-87c0-afb6e0e4eb6f\" (UID: \"88755724-cace-4c21-87c0-afb6e0e4eb6f\") " Apr 22 19:39:54.082758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082608 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-home" (OuterVolumeSpecName: "home") pod "88755724-cace-4c21-87c0-afb6e0e4eb6f" (UID: "88755724-cace-4c21-87c0-afb6e0e4eb6f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:54.082758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082751 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-model-cache" (OuterVolumeSpecName: "model-cache") pod "88755724-cace-4c21-87c0-afb6e0e4eb6f" (UID: "88755724-cace-4c21-87c0-afb6e0e4eb6f"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:54.082984 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082948 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:54.082984 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.082969 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:54.084603 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.084570 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-dshm" (OuterVolumeSpecName: "dshm") pod "88755724-cace-4c21-87c0-afb6e0e4eb6f" (UID: "88755724-cace-4c21-87c0-afb6e0e4eb6f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:54.084717 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.084645 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88755724-cace-4c21-87c0-afb6e0e4eb6f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "88755724-cace-4c21-87c0-afb6e0e4eb6f" (UID: "88755724-cace-4c21-87c0-afb6e0e4eb6f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:54.084717 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.084660 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88755724-cace-4c21-87c0-afb6e0e4eb6f-kube-api-access-sk7p6" (OuterVolumeSpecName: "kube-api-access-sk7p6") pod "88755724-cace-4c21-87c0-afb6e0e4eb6f" (UID: "88755724-cace-4c21-87c0-afb6e0e4eb6f"). InnerVolumeSpecName "kube-api-access-sk7p6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:39:54.136468 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.136427 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88755724-cace-4c21-87c0-afb6e0e4eb6f" (UID: "88755724-cace-4c21-87c0-afb6e0e4eb6f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:54.184544 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.184487 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sk7p6\" (UniqueName: \"kubernetes.io/projected/88755724-cace-4c21-87c0-afb6e0e4eb6f-kube-api-access-sk7p6\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:54.184544 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.184541 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:54.184544 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.184553 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88755724-cace-4c21-87c0-afb6e0e4eb6f-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:54.185096 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.184562 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/88755724-cace-4c21-87c0-afb6e0e4eb6f-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:39:54.952562 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.952531 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" Apr 22 19:39:54.952562 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.952541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2" event={"ID":"88755724-cace-4c21-87c0-afb6e0e4eb6f","Type":"ContainerDied","Data":"59b994bddec825069141df09294e8625ea417a1c85b645ef5d22c9520634d3fe"} Apr 22 19:39:54.952787 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.952587 2576 scope.go:117] "RemoveContainer" containerID="5b9c6ed7983940570d6bf481a458a68e11e5ca89fa60d49550872566ea4e678a" Apr 22 19:39:54.961145 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.961136 2576 scope.go:117] "RemoveContainer" containerID="3195a31f4fe2330dfc6b16b5568af2655ebd4d8b0ce9204542829773d149f5eb" Apr 22 19:39:54.970365 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.970344 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2"] Apr 22 19:39:54.973923 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:54.973899 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-6b7d88c649-7vvf2"] Apr 22 19:39:56.756220 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:39:56.756190 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88755724-cace-4c21-87c0-afb6e0e4eb6f" path="/var/lib/kubelet/pods/88755724-cace-4c21-87c0-afb6e0e4eb6f/volumes" Apr 22 19:40:03.799497 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.799462 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v"] Apr 22 19:40:03.800018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.799953 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88755724-cace-4c21-87c0-afb6e0e4eb6f" containerName="storage-initializer" 
Apr 22 19:40:03.800018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.799970 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88755724-cace-4c21-87c0-afb6e0e4eb6f" containerName="storage-initializer" Apr 22 19:40:03.800018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800005 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="storage-initializer" Apr 22 19:40:03.800018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800014 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="storage-initializer" Apr 22 19:40:03.800225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800025 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="tokenizer" Apr 22 19:40:03.800225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800034 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="tokenizer" Apr 22 19:40:03.800225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800051 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="main" Apr 22 19:40:03.800225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800059 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="main" Apr 22 19:40:03.800225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800073 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88755724-cace-4c21-87c0-afb6e0e4eb6f" containerName="main" Apr 22 19:40:03.800225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800082 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88755724-cace-4c21-87c0-afb6e0e4eb6f" containerName="main" Apr 22 19:40:03.800225 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800160 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="88755724-cace-4c21-87c0-afb6e0e4eb6f" containerName="main" Apr 22 19:40:03.800225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800175 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="main" Apr 22 19:40:03.800225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.800186 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="52365ba6-2172-47c7-bd87-aa4d07b917be" containerName="tokenizer" Apr 22 19:40:03.858331 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.858294 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v"] Apr 22 19:40:03.858534 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.858443 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:03.861306 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.861282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-lora-crit-kserve-self-signed-certs\"" Apr 22 19:40:03.978340 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.978302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-tls-certs\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:03.978563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.978392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-home\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:03.978563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.978461 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:03.978563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.978514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z484g\" (UniqueName: \"kubernetes.io/projected/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kube-api-access-z484g\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:03.978753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.978601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-model-cache\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:03.978753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:03.978657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-dshm\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" 
(UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.079371 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-tls-certs\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.079371 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-home\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.079371 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.079681 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z484g\" (UniqueName: \"kubernetes.io/projected/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kube-api-access-z484g\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.079681 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079438 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-model-cache\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.079681 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-dshm\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.079934 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-home\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.080018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-model-cache\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.080018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.079971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: 
\"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.081795 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.081773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-dshm\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.082187 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.082168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-tls-certs\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.087442 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.087418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z484g\" (UniqueName: \"kubernetes.io/projected/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kube-api-access-z484g\") pod \"conv-test-lora-crit-kserve-78c7c6d56c-cxt5v\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:04.170169 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:04.170134 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:15.505307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:15.505274 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v"] Apr 22 19:40:15.506595 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:40:15.506563 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8186cecf_89c7_4ff1_9273_00b6b6cfbc6c.slice/crio-05df811bd917b0d5dfb55e2ab582ce6e40e0c8225cffd8c98a25de07bd7c368b WatchSource:0}: Error finding container 05df811bd917b0d5dfb55e2ab582ce6e40e0c8225cffd8c98a25de07bd7c368b: Status 404 returned error can't find the container with id 05df811bd917b0d5dfb55e2ab582ce6e40e0c8225cffd8c98a25de07bd7c368b Apr 22 19:40:15.508459 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:15.508441 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:40:16.037554 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:16.037483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" event={"ID":"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c","Type":"ContainerStarted","Data":"2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16"} Apr 22 19:40:16.037554 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:16.037539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" event={"ID":"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c","Type":"ContainerStarted","Data":"05df811bd917b0d5dfb55e2ab582ce6e40e0c8225cffd8c98a25de07bd7c368b"} Apr 22 19:40:16.039329 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:16.039298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" 
event={"ID":"ebcadc4a-e66b-43c5-8347-119fdb4d9578","Type":"ContainerStarted","Data":"1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b"} Apr 22 19:40:16.077374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:16.077259 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podStartSLOduration=6.107443936 podStartE2EDuration="52.077243698s" podCreationTimestamp="2026-04-22 19:39:24 +0000 UTC" firstStartedPulling="2026-04-22 19:39:29.848524756 +0000 UTC m=+919.633514733" lastFinishedPulling="2026-04-22 19:40:15.818324643 +0000 UTC m=+965.603314495" observedRunningTime="2026-04-22 19:40:16.075886942 +0000 UTC m=+965.860876813" watchObservedRunningTime="2026-04-22 19:40:16.077243698 +0000 UTC m=+965.862233569" Apr 22 19:40:17.044568 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:17.044538 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_8186cecf-89c7-4ff1-9273-00b6b6cfbc6c/storage-initializer/0.log" Apr 22 19:40:17.044934 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:17.044580 2576 generic.go:358] "Generic (PLEG): container finished" podID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" containerID="2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16" exitCode=1 Apr 22 19:40:17.044934 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:17.044675 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" event={"ID":"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c","Type":"ContainerDied","Data":"2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16"} Apr 22 19:40:18.049937 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:18.049912 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_8186cecf-89c7-4ff1-9273-00b6b6cfbc6c/storage-initializer/1.log" Apr 22 19:40:18.050350 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:18.050329 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_8186cecf-89c7-4ff1-9273-00b6b6cfbc6c/storage-initializer/0.log" Apr 22 19:40:18.050407 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:18.050387 2576 generic.go:358] "Generic (PLEG): container finished" podID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" containerID="c34c1d7a6ea15965d7104e272b3837f770c9f7a1017e46618ad00197b1a9c616" exitCode=1 Apr 22 19:40:18.050490 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:18.050469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" event={"ID":"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c","Type":"ContainerDied","Data":"c34c1d7a6ea15965d7104e272b3837f770c9f7a1017e46618ad00197b1a9c616"} Apr 22 19:40:18.050557 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:18.050543 2576 scope.go:117] "RemoveContainer" containerID="2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16" Apr 22 19:40:18.050835 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:18.050807 2576 scope.go:117] "RemoveContainer" containerID="2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16" Apr 22 19:40:18.062270 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:40:18.062233 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_kserve-ci-e2e-test_8186cecf-89c7-4ff1-9273-00b6b6cfbc6c_0 in pod sandbox 05df811bd917b0d5dfb55e2ab582ce6e40e0c8225cffd8c98a25de07bd7c368b from index: no such id: '2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16'" 
containerID="2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16" Apr 22 19:40:18.062388 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:18.062280 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_kserve-ci-e2e-test_8186cecf-89c7-4ff1-9273-00b6b6cfbc6c_0 in pod sandbox 05df811bd917b0d5dfb55e2ab582ce6e40e0c8225cffd8c98a25de07bd7c368b from index: no such id: '2753723f74eb3d2a14445f044f95f5a4a6e7c7702b51a859cfb2ac524c1c6e16'" Apr 22 19:40:18.062531 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:40:18.062490 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_kserve-ci-e2e-test(8186cecf-89c7-4ff1-9273-00b6b6cfbc6c)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" Apr 22 19:40:19.058016 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:19.057985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_8186cecf-89c7-4ff1-9273-00b6b6cfbc6c/storage-initializer/1.log" Apr 22 19:40:19.058620 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:40:19.058598 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_kserve-ci-e2e-test(8186cecf-89c7-4ff1-9273-00b6b6cfbc6c)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" Apr 22 
19:40:24.815466 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.815364 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74"] Apr 22 19:40:24.820896 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.820850 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74"] Apr 22 19:40:24.821072 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.821039 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.823848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.823811 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 19:40:24.823995 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.823839 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-2ftsv\"" Apr 22 19:40:24.878281 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.878246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8ctq\" (UniqueName: \"kubernetes.io/projected/fda955d0-0d8a-4923-8675-7996e855a02a-kube-api-access-z8ctq\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.878431 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.878314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fda955d0-0d8a-4923-8675-7996e855a02a-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" 
(UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.878431 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.878375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.878532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.878433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.878532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.878481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.878532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.878522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-tmp\") pod 
\"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.979497 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.979462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fda955d0-0d8a-4923-8675-7996e855a02a-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.979696 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.979545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.979696 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.979568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.979696 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.979598 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: 
\"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.979696 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.979617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.979696 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.979657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8ctq\" (UniqueName: \"kubernetes.io/projected/fda955d0-0d8a-4923-8675-7996e855a02a-kube-api-access-z8ctq\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.980071 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.980038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.980195 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.980108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.980195 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.980126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.980195 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.980170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.982086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.982063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fda955d0-0d8a-4923-8675-7996e855a02a-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:24.987087 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:24.987064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8ctq\" (UniqueName: \"kubernetes.io/projected/fda955d0-0d8a-4923-8675-7996e855a02a-kube-api-access-z8ctq\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 
19:40:25.134756 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:25.134656 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:25.249948 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:25.249901 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:40:25.250111 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:25.249964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:40:25.251368 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:25.251331 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:40:25.273583 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:25.273556 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74"] Apr 22 19:40:25.275428 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:40:25.275401 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfda955d0_0d8a_4923_8675_7996e855a02a.slice/crio-b3c05b3bde4053425f16699b83231afd1cf7cf9a24d724606b44f58fe2852110 WatchSource:0}: Error finding container b3c05b3bde4053425f16699b83231afd1cf7cf9a24d724606b44f58fe2852110: Status 404 returned error can't find the container with id b3c05b3bde4053425f16699b83231afd1cf7cf9a24d724606b44f58fe2852110 Apr 22 19:40:26.087887 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:40:26.087852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" event={"ID":"fda955d0-0d8a-4923-8675-7996e855a02a","Type":"ContainerStarted","Data":"bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958"} Apr 22 19:40:26.087887 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:26.087893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" event={"ID":"fda955d0-0d8a-4923-8675-7996e855a02a","Type":"ContainerStarted","Data":"b3c05b3bde4053425f16699b83231afd1cf7cf9a24d724606b44f58fe2852110"} Apr 22 19:40:27.092877 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.092840 2576 generic.go:358] "Generic (PLEG): container finished" podID="fda955d0-0d8a-4923-8675-7996e855a02a" containerID="bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958" exitCode=0 Apr 22 19:40:27.093253 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.092921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" event={"ID":"fda955d0-0d8a-4923-8675-7996e855a02a","Type":"ContainerDied","Data":"bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958"} Apr 22 19:40:27.526552 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.526519 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v"] Apr 22 19:40:27.676222 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.676200 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_8186cecf-89c7-4ff1-9273-00b6b6cfbc6c/storage-initializer/1.log" Apr 22 19:40:27.676376 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.676271 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:27.706020 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.705986 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-dshm\") pod \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " Apr 22 19:40:27.706212 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706123 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-model-cache\") pod \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " Apr 22 19:40:27.706212 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706186 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-tls-certs\") pod \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " Apr 22 19:40:27.706343 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706211 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-home\") pod \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " Apr 22 19:40:27.706343 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706240 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z484g\" (UniqueName: \"kubernetes.io/projected/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kube-api-access-z484g\") pod \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " Apr 22 19:40:27.706343 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706274 
2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kserve-provision-location\") pod \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\" (UID: \"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c\") " Apr 22 19:40:27.706543 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706395 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-model-cache" (OuterVolumeSpecName: "model-cache") pod "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" (UID: "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:40:27.706543 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706432 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-home" (OuterVolumeSpecName: "home") pod "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" (UID: "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:40:27.706828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706653 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:40:27.706828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706674 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:40:27.706828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.706791 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" (UID: "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:40:27.708460 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.708432 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kube-api-access-z484g" (OuterVolumeSpecName: "kube-api-access-z484g") pod "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" (UID: "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c"). InnerVolumeSpecName "kube-api-access-z484g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:40:27.708685 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.708664 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-dshm" (OuterVolumeSpecName: "dshm") pod "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" (UID: "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:40:27.708764 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.708690 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" (UID: "8186cecf-89c7-4ff1-9273-00b6b6cfbc6c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:40:27.807528 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.807410 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:40:27.807528 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.807444 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z484g\" (UniqueName: \"kubernetes.io/projected/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kube-api-access-z484g\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:40:27.807528 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.807455 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:40:27.807528 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:27.807465 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:40:28.098755 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.098663 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-78c7c6d56c-cxt5v_8186cecf-89c7-4ff1-9273-00b6b6cfbc6c/storage-initializer/1.log" Apr 22 19:40:28.099218 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.098774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" event={"ID":"8186cecf-89c7-4ff1-9273-00b6b6cfbc6c","Type":"ContainerDied","Data":"05df811bd917b0d5dfb55e2ab582ce6e40e0c8225cffd8c98a25de07bd7c368b"} Apr 22 19:40:28.099218 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.098804 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v" Apr 22 19:40:28.099218 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.098814 2576 scope.go:117] "RemoveContainer" containerID="c34c1d7a6ea15965d7104e272b3837f770c9f7a1017e46618ad00197b1a9c616" Apr 22 19:40:28.101098 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.101071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" event={"ID":"fda955d0-0d8a-4923-8675-7996e855a02a","Type":"ContainerStarted","Data":"bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73"} Apr 22 19:40:28.101244 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.101104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" event={"ID":"fda955d0-0d8a-4923-8675-7996e855a02a","Type":"ContainerStarted","Data":"d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647"} Apr 22 19:40:28.101244 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.101204 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:28.122587 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:40:28.122539 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" podStartSLOduration=4.122520043 podStartE2EDuration="4.122520043s" podCreationTimestamp="2026-04-22 19:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:40:28.121609068 +0000 UTC m=+977.906598945" watchObservedRunningTime="2026-04-22 19:40:28.122520043 +0000 UTC m=+977.907509910" Apr 22 19:40:28.155804 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.155740 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v"] Apr 22 19:40:28.157977 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.157949 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-78c7c6d56c-cxt5v"] Apr 22 19:40:28.755894 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:28.755857 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" path="/var/lib/kubelet/pods/8186cecf-89c7-4ff1-9273-00b6b6cfbc6c/volumes" Apr 22 19:40:35.135087 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:35.135052 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:35.135087 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:35.135089 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:35.138051 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:35.138022 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:35.250293 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:35.250246 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:40:36.133548 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:36.133483 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:40:45.250563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:45.250510 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:40:55.250201 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:55.250146 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:40:57.138990 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:40:57.138963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:41:05.250376 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:41:05.250325 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:41:15.249969 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:41:15.249918 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:41:25.250762 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:41:25.250719 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:41:35.250288 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:41:35.250244 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:41:45.250354 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:41:45.250297 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8000/health\": dial tcp 10.134.0.41:8000: connect: connection refused" Apr 22 19:41:55.260055 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:41:55.260021 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:41:55.267702 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:41:55.267663 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:42:00.973579 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:00.973531 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf"] Apr 22 19:42:00.974198 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:00.973802 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" containerID="cri-o://1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b" gracePeriod=30 Apr 22 19:42:31.220876 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.220854 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf_ebcadc4a-e66b-43c5-8347-119fdb4d9578/main/0.log" Apr 22 19:42:31.221280 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.221264 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:42:31.257994 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.257915 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-model-cache\") pod \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " Apr 22 19:42:31.258140 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.258039 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-dshm\") pod \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " Apr 22 19:42:31.258140 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.258069 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-home\") pod \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " Apr 22 19:42:31.258140 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.258090 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kserve-provision-location\") pod \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " Apr 22 19:42:31.258140 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.258121 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcadc4a-e66b-43c5-8347-119fdb4d9578-tls-certs\") pod \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " Apr 22 19:42:31.258360 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:42:31.258195 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frw85\" (UniqueName: \"kubernetes.io/projected/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kube-api-access-frw85\") pod \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\" (UID: \"ebcadc4a-e66b-43c5-8347-119fdb4d9578\") " Apr 22 19:42:31.258360 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.258312 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-model-cache" (OuterVolumeSpecName: "model-cache") pod "ebcadc4a-e66b-43c5-8347-119fdb4d9578" (UID: "ebcadc4a-e66b-43c5-8347-119fdb4d9578"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:31.258476 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.258447 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-home" (OuterVolumeSpecName: "home") pod "ebcadc4a-e66b-43c5-8347-119fdb4d9578" (UID: "ebcadc4a-e66b-43c5-8347-119fdb4d9578"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:31.258590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.258539 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:31.260244 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.260211 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-dshm" (OuterVolumeSpecName: "dshm") pod "ebcadc4a-e66b-43c5-8347-119fdb4d9578" (UID: "ebcadc4a-e66b-43c5-8347-119fdb4d9578"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:31.260369 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.260339 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebcadc4a-e66b-43c5-8347-119fdb4d9578-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ebcadc4a-e66b-43c5-8347-119fdb4d9578" (UID: "ebcadc4a-e66b-43c5-8347-119fdb4d9578"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:42:31.260763 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.260743 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kube-api-access-frw85" (OuterVolumeSpecName: "kube-api-access-frw85") pod "ebcadc4a-e66b-43c5-8347-119fdb4d9578" (UID: "ebcadc4a-e66b-43c5-8347-119fdb4d9578"). InnerVolumeSpecName "kube-api-access-frw85". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:42:31.312741 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.312708 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ebcadc4a-e66b-43c5-8347-119fdb4d9578" (UID: "ebcadc4a-e66b-43c5-8347-119fdb4d9578"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:31.359746 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.359718 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frw85\" (UniqueName: \"kubernetes.io/projected/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kube-api-access-frw85\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:31.359847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.359747 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:31.359847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.359758 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:31.359847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.359767 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebcadc4a-e66b-43c5-8347-119fdb4d9578-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:31.359847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.359777 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebcadc4a-e66b-43c5-8347-119fdb4d9578-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:31.557300 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.557272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf_ebcadc4a-e66b-43c5-8347-119fdb4d9578/main/0.log" Apr 22 19:42:31.557620 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.557600 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerID="1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b" exitCode=137 Apr 22 19:42:31.557672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.557632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" event={"ID":"ebcadc4a-e66b-43c5-8347-119fdb4d9578","Type":"ContainerDied","Data":"1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b"} Apr 22 19:42:31.557672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.557658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" event={"ID":"ebcadc4a-e66b-43c5-8347-119fdb4d9578","Type":"ContainerDied","Data":"d111857c7ff3dbde68ac143e23cbbafe6b6aab0f81fec4b9d1a5022ae882b461"} Apr 22 19:42:31.557743 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.557674 2576 scope.go:117] "RemoveContainer" containerID="1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b" Apr 22 19:42:31.557743 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.557705 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf" Apr 22 19:42:31.577364 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.577316 2576 scope.go:117] "RemoveContainer" containerID="8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184" Apr 22 19:42:31.579810 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.579788 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf"] Apr 22 19:42:31.583749 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.583728 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-74798f754-r8wlf"] Apr 22 19:42:31.588124 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.588108 2576 scope.go:117] "RemoveContainer" containerID="1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b" Apr 22 19:42:31.588394 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:42:31.588366 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b\": container with ID starting with 1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b not found: ID does not exist" containerID="1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b" Apr 22 19:42:31.588448 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.588402 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b"} err="failed to get container status \"1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b\": rpc error: code = NotFound desc = could not find container \"1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b\": container with ID starting with 
1835d96415b8ce0802c03b37e6ee07b1970c4fd0a588e3594ef3f8521776b23b not found: ID does not exist" Apr 22 19:42:31.588448 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.588421 2576 scope.go:117] "RemoveContainer" containerID="8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184" Apr 22 19:42:31.588727 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:42:31.588707 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184\": container with ID starting with 8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184 not found: ID does not exist" containerID="8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184" Apr 22 19:42:31.588788 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:31.588732 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184"} err="failed to get container status \"8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184\": rpc error: code = NotFound desc = could not find container \"8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184\": container with ID starting with 8b8c6f87e042366d145097de48e37af0ff3ee1a53deabf036cb73ea2c287d184 not found: ID does not exist" Apr 22 19:42:32.755570 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:32.755537 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" path="/var/lib/kubelet/pods/ebcadc4a-e66b-43c5-8347-119fdb4d9578/volumes" Apr 22 19:42:35.978916 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:35.978878 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74"] Apr 22 19:42:35.979314 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:35.979202 2576 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="main" containerID="cri-o://d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647" gracePeriod=30 Apr 22 19:42:35.979387 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:35.979288 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="tokenizer" containerID="cri-o://bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73" gracePeriod=30 Apr 22 19:42:36.132758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:36.132715 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.43:8082/healthz\": dial tcp 10.134.0.43:8082: connect: connection refused" Apr 22 19:42:36.577736 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:36.577702 2576 generic.go:358] "Generic (PLEG): container finished" podID="fda955d0-0d8a-4923-8675-7996e855a02a" containerID="d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647" exitCode=0 Apr 22 19:42:36.577933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:36.577769 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" event={"ID":"fda955d0-0d8a-4923-8675-7996e855a02a","Type":"ContainerDied","Data":"d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647"} Apr 22 19:42:37.133876 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.133855 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:42:37.212375 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212328 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-tmp\") pod \"fda955d0-0d8a-4923-8675-7996e855a02a\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " Apr 22 19:42:37.212614 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212400 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fda955d0-0d8a-4923-8675-7996e855a02a-tls-certs\") pod \"fda955d0-0d8a-4923-8675-7996e855a02a\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " Apr 22 19:42:37.212614 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212428 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-kserve-provision-location\") pod \"fda955d0-0d8a-4923-8675-7996e855a02a\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " Apr 22 19:42:37.212614 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8ctq\" (UniqueName: \"kubernetes.io/projected/fda955d0-0d8a-4923-8675-7996e855a02a-kube-api-access-z8ctq\") pod \"fda955d0-0d8a-4923-8675-7996e855a02a\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " Apr 22 19:42:37.212614 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212549 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-cache\") pod \"fda955d0-0d8a-4923-8675-7996e855a02a\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " 
Apr 22 19:42:37.212614 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212609 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-uds\") pod \"fda955d0-0d8a-4923-8675-7996e855a02a\" (UID: \"fda955d0-0d8a-4923-8675-7996e855a02a\") " Apr 22 19:42:37.212893 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212733 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fda955d0-0d8a-4923-8675-7996e855a02a" (UID: "fda955d0-0d8a-4923-8675-7996e855a02a"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:37.212945 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212910 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fda955d0-0d8a-4923-8675-7996e855a02a" (UID: "fda955d0-0d8a-4923-8675-7996e855a02a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:37.212993 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212938 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fda955d0-0d8a-4923-8675-7996e855a02a" (UID: "fda955d0-0d8a-4923-8675-7996e855a02a"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:37.212993 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.212986 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-tmp\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:37.213259 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.213238 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fda955d0-0d8a-4923-8675-7996e855a02a" (UID: "fda955d0-0d8a-4923-8675-7996e855a02a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:37.214511 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.214479 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda955d0-0d8a-4923-8675-7996e855a02a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fda955d0-0d8a-4923-8675-7996e855a02a" (UID: "fda955d0-0d8a-4923-8675-7996e855a02a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:42:37.214755 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.214735 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda955d0-0d8a-4923-8675-7996e855a02a-kube-api-access-z8ctq" (OuterVolumeSpecName: "kube-api-access-z8ctq") pod "fda955d0-0d8a-4923-8675-7996e855a02a" (UID: "fda955d0-0d8a-4923-8675-7996e855a02a"). InnerVolumeSpecName "kube-api-access-z8ctq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:42:37.314054 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.314021 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-uds\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:37.314054 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.314051 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fda955d0-0d8a-4923-8675-7996e855a02a-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:37.314240 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.314064 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:37.314240 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.314076 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8ctq\" (UniqueName: \"kubernetes.io/projected/fda955d0-0d8a-4923-8675-7996e855a02a-kube-api-access-z8ctq\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:37.314240 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.314089 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fda955d0-0d8a-4923-8675-7996e855a02a-tokenizer-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:42:37.583862 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.583773 2576 generic.go:358] "Generic (PLEG): container finished" podID="fda955d0-0d8a-4923-8675-7996e855a02a" containerID="bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73" exitCode=0 Apr 22 19:42:37.583862 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.583855 
2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" Apr 22 19:42:37.584041 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.583858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" event={"ID":"fda955d0-0d8a-4923-8675-7996e855a02a","Type":"ContainerDied","Data":"bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73"} Apr 22 19:42:37.584041 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.583895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74" event={"ID":"fda955d0-0d8a-4923-8675-7996e855a02a","Type":"ContainerDied","Data":"b3c05b3bde4053425f16699b83231afd1cf7cf9a24d724606b44f58fe2852110"} Apr 22 19:42:37.584041 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.583910 2576 scope.go:117] "RemoveContainer" containerID="bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73" Apr 22 19:42:37.592450 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.592435 2576 scope.go:117] "RemoveContainer" containerID="d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647" Apr 22 19:42:37.599518 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.599486 2576 scope.go:117] "RemoveContainer" containerID="bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958" Apr 22 19:42:37.606205 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.606177 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74"] Apr 22 19:42:37.611601 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.611581 2576 scope.go:117] "RemoveContainer" containerID="bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73" Apr 22 19:42:37.611933 ip-10-0-132-160 kubenswrapper[2576]: E0422 
19:42:37.611907 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73\": container with ID starting with bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73 not found: ID does not exist" containerID="bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73" Apr 22 19:42:37.612036 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.611942 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73"} err="failed to get container status \"bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73\": rpc error: code = NotFound desc = could not find container \"bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73\": container with ID starting with bf0e97a0c1f249239376ae618d4864d85d97470ece8cdeaf5ff2afed68020e73 not found: ID does not exist" Apr 22 19:42:37.612036 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.611968 2576 scope.go:117] "RemoveContainer" containerID="d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647" Apr 22 19:42:37.612968 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:42:37.612944 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647\": container with ID starting with d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647 not found: ID does not exist" containerID="d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647" Apr 22 19:42:37.613060 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.612973 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647"} err="failed to get container status 
\"d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647\": rpc error: code = NotFound desc = could not find container \"d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647\": container with ID starting with d6aecca1b52a1db68d047fa27daea2cbd5adf47a6c388b672ef82a444b1dc647 not found: ID does not exist" Apr 22 19:42:37.613060 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.612993 2576 scope.go:117] "RemoveContainer" containerID="bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958" Apr 22 19:42:37.613422 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:42:37.613398 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958\": container with ID starting with bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958 not found: ID does not exist" containerID="bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958" Apr 22 19:42:37.613492 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.613430 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958"} err="failed to get container status \"bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958\": rpc error: code = NotFound desc = could not find container \"bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958\": container with ID starting with bdadafc83ecb372acb8ac8842d6528b12c211f29204eb2d2a0a0a30665256958 not found: ID does not exist" Apr 22 19:42:37.614405 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:37.614390 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-cnd74"] Apr 22 19:42:38.754284 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:38.754253 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fda955d0-0d8a-4923-8675-7996e855a02a" path="/var/lib/kubelet/pods/fda955d0-0d8a-4923-8675-7996e855a02a/volumes" Apr 22 19:42:50.030310 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030276 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7"] Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030624 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030636 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030645 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="tokenizer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030651 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="tokenizer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030665 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="storage-initializer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030672 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="storage-initializer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030681 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" containerName="storage-initializer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030687 2576 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" containerName="storage-initializer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030695 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" containerName="storage-initializer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030700 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" containerName="storage-initializer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030709 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="storage-initializer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030715 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="storage-initializer" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030726 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="main" Apr 22 19:42:50.030753 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030731 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="main" Apr 22 19:42:50.031186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030783 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebcadc4a-e66b-43c5-8347-119fdb4d9578" containerName="main" Apr 22 19:42:50.031186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030790 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="main" Apr 22 19:42:50.031186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030799 2576 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" containerName="storage-initializer" Apr 22 19:42:50.031186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030806 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8186cecf-89c7-4ff1-9273-00b6b6cfbc6c" containerName="storage-initializer" Apr 22 19:42:50.031186 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.030813 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fda955d0-0d8a-4923-8675-7996e855a02a" containerName="tokenizer" Apr 22 19:42:50.033589 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.033571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.036282 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.036240 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 19:42:50.036282 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.036261 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 19:42:50.036282 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.036271 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-k6ghc\"" Apr 22 19:42:50.036571 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.036246 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 19:42:50.042149 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.042127 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7"] Apr 22 19:42:50.123107 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.123078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-home\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.123296 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.123127 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-dshm\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.123296 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.123182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-model-cache\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.123296 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.123199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxpbq\" (UniqueName: \"kubernetes.io/projected/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kube-api-access-dxpbq\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.123296 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.123279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-tls-certs\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: 
\"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.123486 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.123324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kserve-provision-location\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.224432 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.224395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-dshm\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.224650 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.224443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-model-cache\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.224650 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.224463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxpbq\" (UniqueName: \"kubernetes.io/projected/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kube-api-access-dxpbq\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.224650 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.224535 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-tls-certs\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.224650 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.224570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kserve-provision-location\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.224650 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.224646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-home\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.224975 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.224943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-model-cache\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.224975 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.224968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-home\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.225101 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.225016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kserve-provision-location\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.226780 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.226756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-dshm\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.227028 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.227011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-tls-certs\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.232241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.232222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxpbq\" (UniqueName: \"kubernetes.io/projected/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kube-api-access-dxpbq\") pod \"stop-feature-test-kserve-6688bd464-jnsh7\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.344886 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.344801 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:42:50.354818 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.354795 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552"] Apr 22 19:42:50.357843 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.357809 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.360550 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.360525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-xgjtf\"" Apr 22 19:42:50.367423 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.367403 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552"] Apr 22 19:42:50.427080 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.427055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.427172 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.427093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 
19:42:50.427172 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.427112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7k77\" (UniqueName: \"kubernetes.io/projected/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kube-api-access-k7k77\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.427172 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.427168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.427310 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.427188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.427310 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.427235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.477314 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.477287 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7"] Apr 22 19:42:50.479773 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:42:50.479741 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8e26e47_fa1a_46d2_bb74_ba6ac35196be.slice/crio-0d90ec9bb26a11a8a7fd1445ebd13207b89bb9772f7fc2601e735aa26649b3de WatchSource:0}: Error finding container 0d90ec9bb26a11a8a7fd1445ebd13207b89bb9772f7fc2601e735aa26649b3de: Status 404 returned error can't find the container with id 0d90ec9bb26a11a8a7fd1445ebd13207b89bb9772f7fc2601e735aa26649b3de Apr 22 19:42:50.528219 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528315 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528315 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528315 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528458 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528458 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7k77\" (UniqueName: \"kubernetes.io/projected/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kube-api-access-k7k77\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528652 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.528834 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.528776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.530739 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.530714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tls-certs\") pod 
\"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.536276 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.536253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7k77\" (UniqueName: \"kubernetes.io/projected/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kube-api-access-k7k77\") pod \"stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.631642 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.631568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" event={"ID":"c8e26e47-fa1a-46d2-bb74-ba6ac35196be","Type":"ContainerStarted","Data":"813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457"} Apr 22 19:42:50.631642 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.631602 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" event={"ID":"c8e26e47-fa1a-46d2-bb74-ba6ac35196be","Type":"ContainerStarted","Data":"0d90ec9bb26a11a8a7fd1445ebd13207b89bb9772f7fc2601e735aa26649b3de"} Apr 22 19:42:50.680846 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.680810 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:50.810997 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:50.810973 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552"] Apr 22 19:42:50.813382 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:42:50.813353 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f5fc4c3_44dd_4bdf_9748_d334bf449842.slice/crio-5dffb9e4e0346995bc1d2deee12e457ab0e1f67e15ef4ed335ea57ade01b2513 WatchSource:0}: Error finding container 5dffb9e4e0346995bc1d2deee12e457ab0e1f67e15ef4ed335ea57ade01b2513: Status 404 returned error can't find the container with id 5dffb9e4e0346995bc1d2deee12e457ab0e1f67e15ef4ed335ea57ade01b2513 Apr 22 19:42:51.638600 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:51.638551 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" event={"ID":"8f5fc4c3-44dd-4bdf-9748-d334bf449842","Type":"ContainerStarted","Data":"622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51"} Apr 22 19:42:51.639045 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:51.638638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" event={"ID":"8f5fc4c3-44dd-4bdf-9748-d334bf449842","Type":"ContainerStarted","Data":"5dffb9e4e0346995bc1d2deee12e457ab0e1f67e15ef4ed335ea57ade01b2513"} Apr 22 19:42:52.646300 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:52.646261 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerID="622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51" exitCode=0 Apr 22 19:42:52.646791 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:52.646342 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" event={"ID":"8f5fc4c3-44dd-4bdf-9748-d334bf449842","Type":"ContainerDied","Data":"622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51"} Apr 22 19:42:53.652795 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:53.652755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" event={"ID":"8f5fc4c3-44dd-4bdf-9748-d334bf449842","Type":"ContainerStarted","Data":"2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be"} Apr 22 19:42:53.652795 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:53.652801 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" event={"ID":"8f5fc4c3-44dd-4bdf-9748-d334bf449842","Type":"ContainerStarted","Data":"00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d"} Apr 22 19:42:53.653256 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:53.652903 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:42:53.677639 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:53.677594 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" podStartSLOduration=3.677578525 podStartE2EDuration="3.677578525s" podCreationTimestamp="2026-04-22 19:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:42:53.67495776 +0000 UTC m=+1123.459947633" watchObservedRunningTime="2026-04-22 19:42:53.677578525 +0000 UTC m=+1123.462568396" Apr 22 19:42:55.668661 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:55.668619 2576 
generic.go:358] "Generic (PLEG): container finished" podID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerID="813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457" exitCode=0 Apr 22 19:42:55.669144 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:55.668698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" event={"ID":"c8e26e47-fa1a-46d2-bb74-ba6ac35196be","Type":"ContainerDied","Data":"813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457"} Apr 22 19:42:56.677315 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:56.677284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" event={"ID":"c8e26e47-fa1a-46d2-bb74-ba6ac35196be","Type":"ContainerStarted","Data":"88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81"} Apr 22 19:42:56.700915 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:42:56.700865 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podStartSLOduration=6.700848946 podStartE2EDuration="6.700848946s" podCreationTimestamp="2026-04-22 19:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:42:56.69922041 +0000 UTC m=+1126.484210282" watchObservedRunningTime="2026-04-22 19:42:56.700848946 +0000 UTC m=+1126.485838817" Apr 22 19:43:00.345270 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:00.345228 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:43:00.345270 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:00.345268 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:43:00.347017 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:43:00.346987 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:43:00.681727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:00.681617 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:43:00.681727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:00.681684 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:43:00.684597 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:00.684565 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:43:00.692827 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:00.692805 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:43:10.345423 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:10.345376 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:43:20.345702 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:20.345656 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:43:22.700664 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:22.700635 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:43:30.345892 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:30.345798 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:43:40.345401 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:40.345359 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:43:50.345623 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:43:50.345585 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:44:00.345707 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:00.345667 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:44:10.346138 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:44:10.346089 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:44:10.726018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:10.725930 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log" Apr 22 19:44:10.726686 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:10.726660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log" Apr 22 19:44:10.729710 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:10.729688 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log" Apr 22 19:44:10.730412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:10.730383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log" Apr 22 19:44:20.346165 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:20.346119 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" probeResult="failure" output="Get \"https://10.134.0.44:8000/health\": dial tcp 10.134.0.44:8000: connect: connection refused" Apr 22 19:44:30.355479 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:30.355443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:44:30.363741 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:30.363717 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:44:47.482234 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:47.482200 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552"] Apr 22 19:44:47.482758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:47.482531 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="main" containerID="cri-o://00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d" gracePeriod=30 Apr 22 19:44:47.482758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:47.482582 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="tokenizer" containerID="cri-o://2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be" gracePeriod=30 Apr 22 19:44:47.490199 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:47.490170 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7"] Apr 22 19:44:47.490608 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:47.490579 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" containerID="cri-o://88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81" gracePeriod=30 Apr 22 19:44:48.080940 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:44:48.080904 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerID="00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d" exitCode=0 Apr 22 19:44:48.081140 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.080977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" event={"ID":"8f5fc4c3-44dd-4bdf-9748-d334bf449842","Type":"ContainerDied","Data":"00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d"} Apr 22 19:44:48.734613 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.734590 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:44:48.838227 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838191 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7k77\" (UniqueName: \"kubernetes.io/projected/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kube-api-access-k7k77\") pod \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " Apr 22 19:44:48.838437 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838295 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-tmp\") pod \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " Apr 22 19:44:48.838437 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838333 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-cache\") pod \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " Apr 22 19:44:48.838437 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:44:48.838365 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tls-certs\") pod \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " Apr 22 19:44:48.838643 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838444 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kserve-provision-location\") pod \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " Apr 22 19:44:48.838643 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838468 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-uds\") pod \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\" (UID: \"8f5fc4c3-44dd-4bdf-9748-d334bf449842\") " Apr 22 19:44:48.838755 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838683 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8f5fc4c3-44dd-4bdf-9748-d334bf449842" (UID: "8f5fc4c3-44dd-4bdf-9748-d334bf449842"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:48.838802 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838782 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8f5fc4c3-44dd-4bdf-9748-d334bf449842" (UID: "8f5fc4c3-44dd-4bdf-9748-d334bf449842"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:48.838848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838813 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:44:48.838848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.838785 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8f5fc4c3-44dd-4bdf-9748-d334bf449842" (UID: "8f5fc4c3-44dd-4bdf-9748-d334bf449842"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:48.839185 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.839162 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8f5fc4c3-44dd-4bdf-9748-d334bf449842" (UID: "8f5fc4c3-44dd-4bdf-9748-d334bf449842"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:48.840409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.840390 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kube-api-access-k7k77" (OuterVolumeSpecName: "kube-api-access-k7k77") pod "8f5fc4c3-44dd-4bdf-9748-d334bf449842" (UID: "8f5fc4c3-44dd-4bdf-9748-d334bf449842"). InnerVolumeSpecName "kube-api-access-k7k77". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:44:48.840577 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.840556 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8f5fc4c3-44dd-4bdf-9748-d334bf449842" (UID: "8f5fc4c3-44dd-4bdf-9748-d334bf449842"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:44:48.940032 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.940004 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-tmp\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:44:48.940032 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.940032 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:44:48.940200 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.940042 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:44:48.940200 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.940052 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8f5fc4c3-44dd-4bdf-9748-d334bf449842-tokenizer-uds\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:44:48.940200 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:48.940062 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k7k77\" (UniqueName: 
\"kubernetes.io/projected/8f5fc4c3-44dd-4bdf-9748-d334bf449842-kube-api-access-k7k77\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:44:49.092477 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.092384 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerID="2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be" exitCode=0 Apr 22 19:44:49.092477 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.092461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" event={"ID":"8f5fc4c3-44dd-4bdf-9748-d334bf449842","Type":"ContainerDied","Data":"2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be"} Apr 22 19:44:49.092733 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.092526 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" event={"ID":"8f5fc4c3-44dd-4bdf-9748-d334bf449842","Type":"ContainerDied","Data":"5dffb9e4e0346995bc1d2deee12e457ab0e1f67e15ef4ed335ea57ade01b2513"} Apr 22 19:44:49.092733 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.092526 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552" Apr 22 19:44:49.092733 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.092543 2576 scope.go:117] "RemoveContainer" containerID="2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be" Apr 22 19:44:49.104150 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.103661 2576 scope.go:117] "RemoveContainer" containerID="00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d" Apr 22 19:44:49.112170 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.112150 2576 scope.go:117] "RemoveContainer" containerID="622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51" Apr 22 19:44:49.115642 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.115618 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552"] Apr 22 19:44:49.119538 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.119517 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-5f9dd5d4f6-nl552"] Apr 22 19:44:49.120108 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.120097 2576 scope.go:117] "RemoveContainer" containerID="2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be" Apr 22 19:44:49.120365 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:44:49.120347 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be\": container with ID starting with 2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be not found: ID does not exist" containerID="2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be" Apr 22 19:44:49.120408 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.120389 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be"} err="failed to get container status \"2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be\": rpc error: code = NotFound desc = could not find container \"2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be\": container with ID starting with 2ccc53b975b762e13149978722de5307c01246b24b326157279b71bfff63e7be not found: ID does not exist" Apr 22 19:44:49.120449 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.120407 2576 scope.go:117] "RemoveContainer" containerID="00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d" Apr 22 19:44:49.120683 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:44:49.120666 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d\": container with ID starting with 00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d not found: ID does not exist" containerID="00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d" Apr 22 19:44:49.120737 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.120690 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d"} err="failed to get container status \"00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d\": rpc error: code = NotFound desc = could not find container \"00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d\": container with ID starting with 00d86ad536d0525704cd16787abc3e51db853f6f139858bc882ab603052d8a2d not found: ID does not exist" Apr 22 19:44:49.120737 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.120707 2576 scope.go:117] "RemoveContainer" containerID="622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51" Apr 22 19:44:49.120959 ip-10-0-132-160 
kubenswrapper[2576]: E0422 19:44:49.120941 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51\": container with ID starting with 622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51 not found: ID does not exist" containerID="622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51" Apr 22 19:44:49.121004 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:49.120963 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51"} err="failed to get container status \"622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51\": rpc error: code = NotFound desc = could not find container \"622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51\": container with ID starting with 622238112f1f54c83159e67995e9e268009da37b61ce5976ccf428b80696ee51 not found: ID does not exist" Apr 22 19:44:50.755563 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:44:50.755533 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" path="/var/lib/kubelet/pods/8f5fc4c3-44dd-4bdf-9748-d334bf449842/volumes" Apr 22 19:45:04.325541 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.325490 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs"] Apr 22 19:45:04.327936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.325973 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="tokenizer" Apr 22 19:45:04.327936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.325988 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="tokenizer" Apr 22 
19:45:04.327936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.326004 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="storage-initializer" Apr 22 19:45:04.327936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.326010 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="storage-initializer" Apr 22 19:45:04.327936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.326017 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="main" Apr 22 19:45:04.327936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.326022 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="main" Apr 22 19:45:04.327936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.326079 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="tokenizer" Apr 22 19:45:04.327936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.326087 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f5fc4c3-44dd-4bdf-9748-d334bf449842" containerName="main" Apr 22 19:45:04.329050 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.329031 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.331655 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.331628 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 22 19:45:04.338625 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.338592 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs"] Apr 22 19:45:04.380024 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.379992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.380148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.380028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.380148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.380059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: 
\"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.380227 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.380165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js9lf\" (UniqueName: \"kubernetes.io/projected/a18a641c-ece0-454e-b012-38bed347f69e-kube-api-access-js9lf\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.380227 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.380208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a641c-ece0-454e-b012-38bed347f69e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.380293 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.380232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.481545 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.481489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-js9lf\" (UniqueName: \"kubernetes.io/projected/a18a641c-ece0-454e-b012-38bed347f69e-kube-api-access-js9lf\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: 
\"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.481697 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.481576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a641c-ece0-454e-b012-38bed347f69e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.481697 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.481618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.481697 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.481671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.481847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.481699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.481847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.481804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.482098 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.482074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.482163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.482132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.482216 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.482183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.483902 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.483885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.484182 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.484165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a641c-ece0-454e-b012-38bed347f69e-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.489661 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.489636 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-js9lf\" (UniqueName: \"kubernetes.io/projected/a18a641c-ece0-454e-b012-38bed347f69e-kube-api-access-js9lf\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.640003 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.639915 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:04.781925 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:04.781861 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs"] Apr 22 19:45:04.784386 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:45:04.784340 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda18a641c_ece0_454e_b012_38bed347f69e.slice/crio-cda9122be11cee5775c5ac09b9c571508bb1271a6d52875cc3cdaf8c9daae319 WatchSource:0}: Error finding container cda9122be11cee5775c5ac09b9c571508bb1271a6d52875cc3cdaf8c9daae319: Status 404 returned error can't find the container with id cda9122be11cee5775c5ac09b9c571508bb1271a6d52875cc3cdaf8c9daae319 Apr 22 19:45:05.146764 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:05.146728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" event={"ID":"a18a641c-ece0-454e-b012-38bed347f69e","Type":"ContainerStarted","Data":"e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2"} Apr 22 19:45:05.146764 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:05.146770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" event={"ID":"a18a641c-ece0-454e-b012-38bed347f69e","Type":"ContainerStarted","Data":"cda9122be11cee5775c5ac09b9c571508bb1271a6d52875cc3cdaf8c9daae319"} Apr 22 19:45:10.167372 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:10.167334 2576 generic.go:358] "Generic (PLEG): container finished" podID="a18a641c-ece0-454e-b012-38bed347f69e" containerID="e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2" exitCode=0 Apr 22 19:45:10.167841 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:45:10.167407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" event={"ID":"a18a641c-ece0-454e-b012-38bed347f69e","Type":"ContainerDied","Data":"e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2"} Apr 22 19:45:11.172829 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:11.172796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" event={"ID":"a18a641c-ece0-454e-b012-38bed347f69e","Type":"ContainerStarted","Data":"ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23"} Apr 22 19:45:11.195168 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:11.195110 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podStartSLOduration=7.195095435 podStartE2EDuration="7.195095435s" podCreationTimestamp="2026-04-22 19:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:45:11.192665819 +0000 UTC m=+1260.977655690" watchObservedRunningTime="2026-04-22 19:45:11.195095435 +0000 UTC m=+1260.980085306" Apr 22 19:45:14.640874 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:14.640826 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:14.641274 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:14.640890 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:45:14.642064 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:14.642037 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:45:17.722134 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.722112 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6688bd464-jnsh7_c8e26e47-fa1a-46d2-bb74-ba6ac35196be/main/0.log" Apr 22 19:45:17.722496 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.722453 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:45:17.814639 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.814610 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-model-cache\") pod \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " Apr 22 19:45:17.814836 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.814653 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-dshm\") pod \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " Apr 22 19:45:17.814836 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.814679 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxpbq\" (UniqueName: \"kubernetes.io/projected/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kube-api-access-dxpbq\") pod \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " Apr 22 19:45:17.814836 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.814758 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-home\") pod \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " Apr 22 19:45:17.814836 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.814828 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-tls-certs\") pod \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " Apr 22 19:45:17.815047 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.814852 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kserve-provision-location\") pod \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\" (UID: \"c8e26e47-fa1a-46d2-bb74-ba6ac35196be\") " Apr 22 19:45:17.815047 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.814885 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-model-cache" (OuterVolumeSpecName: "model-cache") pod "c8e26e47-fa1a-46d2-bb74-ba6ac35196be" (UID: "c8e26e47-fa1a-46d2-bb74-ba6ac35196be"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:45:17.815212 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.815192 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:45:17.815289 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.815191 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-home" (OuterVolumeSpecName: "home") pod "c8e26e47-fa1a-46d2-bb74-ba6ac35196be" (UID: "c8e26e47-fa1a-46d2-bb74-ba6ac35196be"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:45:17.816776 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.816745 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-dshm" (OuterVolumeSpecName: "dshm") pod "c8e26e47-fa1a-46d2-bb74-ba6ac35196be" (UID: "c8e26e47-fa1a-46d2-bb74-ba6ac35196be"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:45:17.817048 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.817021 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c8e26e47-fa1a-46d2-bb74-ba6ac35196be" (UID: "c8e26e47-fa1a-46d2-bb74-ba6ac35196be"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:45:17.817291 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.817269 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kube-api-access-dxpbq" (OuterVolumeSpecName: "kube-api-access-dxpbq") pod "c8e26e47-fa1a-46d2-bb74-ba6ac35196be" (UID: "c8e26e47-fa1a-46d2-bb74-ba6ac35196be"). InnerVolumeSpecName "kube-api-access-dxpbq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:45:17.872487 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.872443 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c8e26e47-fa1a-46d2-bb74-ba6ac35196be" (UID: "c8e26e47-fa1a-46d2-bb74-ba6ac35196be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:45:17.916153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.916117 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:45:17.916153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.916149 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:45:17.916153 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.916160 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:45:17.916369 ip-10-0-132-160 
kubenswrapper[2576]: I0422 19:45:17.916168 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:45:17.916369 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:17.916179 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dxpbq\" (UniqueName: \"kubernetes.io/projected/c8e26e47-fa1a-46d2-bb74-ba6ac35196be-kube-api-access-dxpbq\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:45:18.202188 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.202116 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-6688bd464-jnsh7_c8e26e47-fa1a-46d2-bb74-ba6ac35196be/main/0.log" Apr 22 19:45:18.202495 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.202466 2576 generic.go:358] "Generic (PLEG): container finished" podID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerID="88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81" exitCode=137 Apr 22 19:45:18.202595 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.202540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" event={"ID":"c8e26e47-fa1a-46d2-bb74-ba6ac35196be","Type":"ContainerDied","Data":"88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81"} Apr 22 19:45:18.202595 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.202566 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" Apr 22 19:45:18.202595 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.202582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7" event={"ID":"c8e26e47-fa1a-46d2-bb74-ba6ac35196be","Type":"ContainerDied","Data":"0d90ec9bb26a11a8a7fd1445ebd13207b89bb9772f7fc2601e735aa26649b3de"} Apr 22 19:45:18.202714 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.202603 2576 scope.go:117] "RemoveContainer" containerID="88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81" Apr 22 19:45:18.221609 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.221588 2576 scope.go:117] "RemoveContainer" containerID="813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457" Apr 22 19:45:18.227925 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.227900 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7"] Apr 22 19:45:18.239525 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.234226 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-6688bd464-jnsh7"] Apr 22 19:45:18.291798 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.291774 2576 scope.go:117] "RemoveContainer" containerID="88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81" Apr 22 19:45:18.292144 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:45:18.292122 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81\": container with ID starting with 88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81 not found: ID does not exist" containerID="88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81" Apr 22 19:45:18.292238 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:45:18.292155 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81"} err="failed to get container status \"88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81\": rpc error: code = NotFound desc = could not find container \"88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81\": container with ID starting with 88c8b967fa7d82840a2a652a2ecd7841fb6386b665265fa991f3fc050ba24b81 not found: ID does not exist" Apr 22 19:45:18.292238 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.292178 2576 scope.go:117] "RemoveContainer" containerID="813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457" Apr 22 19:45:18.292411 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:45:18.292395 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457\": container with ID starting with 813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457 not found: ID does not exist" containerID="813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457" Apr 22 19:45:18.292457 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.292414 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457"} err="failed to get container status \"813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457\": rpc error: code = NotFound desc = could not find container \"813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457\": container with ID starting with 813054ea78765a5c97176b23ad74b6d81c1b501b3289bdcc840457e79fd27457 not found: ID does not exist" Apr 22 19:45:18.755512 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:18.755474 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" path="/var/lib/kubelet/pods/c8e26e47-fa1a-46d2-bb74-ba6ac35196be/volumes" Apr 22 19:45:24.641020 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:24.640964 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:45:34.640835 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:34.640782 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:45:44.640421 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:44.640374 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:45:54.640953 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:45:54.640902 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:46:04.641049 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:04.640998 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:46:08.165197 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.164893 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s"] Apr 22 19:46:08.165694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.165437 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="storage-initializer" Apr 22 19:46:08.165694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.165457 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="storage-initializer" Apr 22 19:46:08.165694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.165482 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" Apr 22 19:46:08.165694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.165491 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" Apr 22 19:46:08.165694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.165674 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8e26e47-fa1a-46d2-bb74-ba6ac35196be" containerName="main" Apr 22 19:46:08.167717 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.167695 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.170101 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.170076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 19:46:08.178415 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.178391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s"] Apr 22 19:46:08.256266 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.256223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.256458 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.256298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtwdj\" (UniqueName: \"kubernetes.io/projected/90592f52-d190-4a9c-b724-26fa09066937-kube-api-access-jtwdj\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.256458 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.256326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.256458 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.256354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.256659 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.256525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90592f52-d190-4a9c-b724-26fa09066937-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.256659 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.256578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.357954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.357904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtwdj\" (UniqueName: \"kubernetes.io/projected/90592f52-d190-4a9c-b724-26fa09066937-kube-api-access-jtwdj\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: 
\"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.358152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.357975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.358152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.358025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.358152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.358129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90592f52-d190-4a9c-b724-26fa09066937-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.358335 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.358168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.358335 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.358197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.358517 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.358472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.358648 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.358543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.358648 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.358556 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 
19:46:08.360480 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.360454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.360766 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.360744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90592f52-d190-4a9c-b724-26fa09066937-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.367095 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.367067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtwdj\" (UniqueName: \"kubernetes.io/projected/90592f52-d190-4a9c-b724-26fa09066937-kube-api-access-jtwdj\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.450983 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.450906 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2"] Apr 22 19:46:08.453262 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.453246 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.455628 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.455603 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-79bnj\"" Apr 22 19:46:08.463766 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.463743 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2"] Apr 22 19:46:08.480190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.480162 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:08.560328 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.560288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8fh8\" (UniqueName: \"kubernetes.io/projected/024ee7dc-6cea-40b2-ab70-37dc1568424e-kube-api-access-q8fh8\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.560467 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.560339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.560467 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.560381 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.560623 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.560526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.560623 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.560590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/024ee7dc-6cea-40b2-ab70-37dc1568424e-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.560623 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.560618 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.661849 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:46:08.661814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.661849 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.661858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/024ee7dc-6cea-40b2-ab70-37dc1568424e-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.662120 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.661876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.662120 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.661930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8fh8\" (UniqueName: \"kubernetes.io/projected/024ee7dc-6cea-40b2-ab70-37dc1568424e-kube-api-access-q8fh8\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.662120 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.661952 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.662360 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.662308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.662360 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.662328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.662360 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.662341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.662590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.662370 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.662590 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.662574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.664389 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.664363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/024ee7dc-6cea-40b2-ab70-37dc1568424e-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.674211 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.674186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8fh8\" (UniqueName: \"kubernetes.io/projected/024ee7dc-6cea-40b2-ab70-37dc1568424e-kube-api-access-q8fh8\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.763789 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.763704 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:08.831098 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.831064 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s"] Apr 22 19:46:08.832850 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:46:08.832818 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90592f52_d190_4a9c_b724_26fa09066937.slice/crio-3e26c372fd6bb3eb22c3584dfb1f3fa7ec6be826dc074a09f3f8a5b88303beda WatchSource:0}: Error finding container 3e26c372fd6bb3eb22c3584dfb1f3fa7ec6be826dc074a09f3f8a5b88303beda: Status 404 returned error can't find the container with id 3e26c372fd6bb3eb22c3584dfb1f3fa7ec6be826dc074a09f3f8a5b88303beda Apr 22 19:46:08.834842 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.834824 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:46:08.905554 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:08.905531 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2"] Apr 22 19:46:08.908397 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:46:08.908370 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024ee7dc_6cea_40b2_ab70_37dc1568424e.slice/crio-d23e2b1164e02bb280060022e2217e8e282ef0a0174450ad00770f928c145e52 WatchSource:0}: Error finding container d23e2b1164e02bb280060022e2217e8e282ef0a0174450ad00770f928c145e52: Status 404 returned error can't find the container with id d23e2b1164e02bb280060022e2217e8e282ef0a0174450ad00770f928c145e52 Apr 22 19:46:09.386289 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:09.386253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" event={"ID":"024ee7dc-6cea-40b2-ab70-37dc1568424e","Type":"ContainerStarted","Data":"a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c"} Apr 22 19:46:09.386756 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:09.386568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" event={"ID":"024ee7dc-6cea-40b2-ab70-37dc1568424e","Type":"ContainerStarted","Data":"d23e2b1164e02bb280060022e2217e8e282ef0a0174450ad00770f928c145e52"} Apr 22 19:46:09.388119 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:09.388088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" event={"ID":"90592f52-d190-4a9c-b724-26fa09066937","Type":"ContainerStarted","Data":"94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887"} Apr 22 19:46:09.388277 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:09.388124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" event={"ID":"90592f52-d190-4a9c-b724-26fa09066937","Type":"ContainerStarted","Data":"3e26c372fd6bb3eb22c3584dfb1f3fa7ec6be826dc074a09f3f8a5b88303beda"} Apr 22 19:46:10.394420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:10.394373 2576 generic.go:358] "Generic (PLEG): container finished" podID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerID="a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c" exitCode=0 Apr 22 19:46:10.394954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:10.394453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" 
event={"ID":"024ee7dc-6cea-40b2-ab70-37dc1568424e","Type":"ContainerDied","Data":"a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c"} Apr 22 19:46:11.401157 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:11.401113 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" event={"ID":"024ee7dc-6cea-40b2-ab70-37dc1568424e","Type":"ContainerStarted","Data":"146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164"} Apr 22 19:46:11.401157 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:11.401162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" event={"ID":"024ee7dc-6cea-40b2-ab70-37dc1568424e","Type":"ContainerStarted","Data":"30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5"} Apr 22 19:46:11.401769 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:11.401215 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:11.425125 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:11.425064 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" podStartSLOduration=3.425047008 podStartE2EDuration="3.425047008s" podCreationTimestamp="2026-04-22 19:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:46:11.423076387 +0000 UTC m=+1321.208066260" watchObservedRunningTime="2026-04-22 19:46:11.425047008 +0000 UTC m=+1321.210036881" Apr 22 19:46:13.412924 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:13.412836 2576 generic.go:358] "Generic (PLEG): container finished" podID="90592f52-d190-4a9c-b724-26fa09066937" 
containerID="94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887" exitCode=0 Apr 22 19:46:13.413303 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:13.412916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" event={"ID":"90592f52-d190-4a9c-b724-26fa09066937","Type":"ContainerDied","Data":"94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887"} Apr 22 19:46:14.419959 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:14.419911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" event={"ID":"90592f52-d190-4a9c-b724-26fa09066937","Type":"ContainerStarted","Data":"9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e"} Apr 22 19:46:14.442570 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:14.442495 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podStartSLOduration=6.442480816 podStartE2EDuration="6.442480816s" podCreationTimestamp="2026-04-22 19:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:46:14.441811946 +0000 UTC m=+1324.226801818" watchObservedRunningTime="2026-04-22 19:46:14.442480816 +0000 UTC m=+1324.227470687" Apr 22 19:46:14.641247 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:14.641187 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:46:18.480665 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:18.480624 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:18.481066 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:18.480802 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:46:18.482415 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:18.482390 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:46:18.763975 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:18.763884 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:18.763975 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:18.763934 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:18.766919 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:18.766894 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:19.441280 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:19.441250 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:24.641463 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:24.641361 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:46:28.481022 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:28.480975 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:46:34.640443 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:34.640399 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 22 19:46:38.480901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:38.480855 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:46:40.446903 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:40.446870 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:46:44.651084 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:44.651046 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:46:44.659531 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:44.659489 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:46:48.480989 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:48.480934 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:46:48.985953 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:48.985913 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs"] Apr 22 19:46:48.986319 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:48.986288 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" containerID="cri-o://ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23" gracePeriod=30 Apr 22 19:46:58.481295 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:46:58.481247 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:47:00.087155 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.087119 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:47:00.092338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.092315 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.094901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.094873 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-kvsxc\"" Apr 22 19:47:00.094901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.094887 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 19:47:00.098603 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.098580 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:47:00.172713 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.172675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.172874 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.172790 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf5d6\" (UniqueName: \"kubernetes.io/projected/41637b2e-b65a-4995-9476-c1cf0502ce40-kube-api-access-jf5d6\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.172874 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.172851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.172968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.172882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.172968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.172937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.173070 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.173010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41637b2e-b65a-4995-9476-c1cf0502ce40-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274366 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.274320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274588 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.274376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41637b2e-b65a-4995-9476-c1cf0502ce40-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274588 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.274432 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274588 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.274481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf5d6\" (UniqueName: \"kubernetes.io/projected/41637b2e-b65a-4995-9476-c1cf0502ce40-kube-api-access-jf5d6\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274588 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:47:00.274550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274588 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.274581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.274737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.274804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.274927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.274868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.276727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.276696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.276883 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.276866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41637b2e-b65a-4995-9476-c1cf0502ce40-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.284534 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.284480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf5d6\" (UniqueName: \"kubernetes.io/projected/41637b2e-b65a-4995-9476-c1cf0502ce40-kube-api-access-jf5d6\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.405035 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.404957 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:47:00.558618 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.558586 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:47:00.562428 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:47:00.562396 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41637b2e_b65a_4995_9476_c1cf0502ce40.slice/crio-8e7e99579334b34291d753be880c63957d31c66cd011e6d838fe484222681c7b WatchSource:0}: Error finding container 8e7e99579334b34291d753be880c63957d31c66cd011e6d838fe484222681c7b: Status 404 returned error can't find the container with id 8e7e99579334b34291d753be880c63957d31c66cd011e6d838fe484222681c7b Apr 22 19:47:00.597959 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:00.597929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"41637b2e-b65a-4995-9476-c1cf0502ce40","Type":"ContainerStarted","Data":"8e7e99579334b34291d753be880c63957d31c66cd011e6d838fe484222681c7b"} Apr 22 19:47:01.603157 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:01.603119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"41637b2e-b65a-4995-9476-c1cf0502ce40","Type":"ContainerStarted","Data":"3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f"} Apr 22 19:47:05.622781 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:05.622743 2576 generic.go:358] "Generic (PLEG): container finished" podID="41637b2e-b65a-4995-9476-c1cf0502ce40" containerID="3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f" exitCode=0 Apr 22 19:47:05.623173 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:05.622827 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"41637b2e-b65a-4995-9476-c1cf0502ce40","Type":"ContainerDied","Data":"3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f"} Apr 22 19:47:06.630401 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:06.630362 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"41637b2e-b65a-4995-9476-c1cf0502ce40","Type":"ContainerStarted","Data":"cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521"} Apr 22 19:47:06.652493 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:06.652428 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.652407212 podStartE2EDuration="6.652407212s" podCreationTimestamp="2026-04-22 19:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:47:06.651327305 +0000 UTC m=+1376.436317190" watchObservedRunningTime="2026-04-22 19:47:06.652407212 +0000 UTC m=+1376.437397088" Apr 22 19:47:08.481256 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:08.481206 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:47:18.481727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:18.481681 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:47:19.260953 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.260922 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs_a18a641c-ece0-454e-b012-38bed347f69e/main/0.log" Apr 22 19:47:19.261332 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.261312 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:47:19.369857 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.369814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-model-cache\") pod \"a18a641c-ece0-454e-b012-38bed347f69e\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " Apr 22 19:47:19.370056 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.369904 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-dshm\") pod \"a18a641c-ece0-454e-b012-38bed347f69e\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " Apr 22 19:47:19.370056 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.369950 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a641c-ece0-454e-b012-38bed347f69e-tls-certs\") pod \"a18a641c-ece0-454e-b012-38bed347f69e\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " Apr 22 19:47:19.370056 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.369989 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-home\") pod 
\"a18a641c-ece0-454e-b012-38bed347f69e\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " Apr 22 19:47:19.370056 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.370039 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-kserve-provision-location\") pod \"a18a641c-ece0-454e-b012-38bed347f69e\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " Apr 22 19:47:19.370292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.370088 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js9lf\" (UniqueName: \"kubernetes.io/projected/a18a641c-ece0-454e-b012-38bed347f69e-kube-api-access-js9lf\") pod \"a18a641c-ece0-454e-b012-38bed347f69e\" (UID: \"a18a641c-ece0-454e-b012-38bed347f69e\") " Apr 22 19:47:19.370531 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.370465 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-home" (OuterVolumeSpecName: "home") pod "a18a641c-ece0-454e-b012-38bed347f69e" (UID: "a18a641c-ece0-454e-b012-38bed347f69e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:47:19.370759 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.370727 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-model-cache" (OuterVolumeSpecName: "model-cache") pod "a18a641c-ece0-454e-b012-38bed347f69e" (UID: "a18a641c-ece0-454e-b012-38bed347f69e"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:47:19.372253 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.372220 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a18a641c-ece0-454e-b012-38bed347f69e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a18a641c-ece0-454e-b012-38bed347f69e" (UID: "a18a641c-ece0-454e-b012-38bed347f69e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:47:19.372486 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.372459 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-dshm" (OuterVolumeSpecName: "dshm") pod "a18a641c-ece0-454e-b012-38bed347f69e" (UID: "a18a641c-ece0-454e-b012-38bed347f69e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:47:19.372628 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.372525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18a641c-ece0-454e-b012-38bed347f69e-kube-api-access-js9lf" (OuterVolumeSpecName: "kube-api-access-js9lf") pod "a18a641c-ece0-454e-b012-38bed347f69e" (UID: "a18a641c-ece0-454e-b012-38bed347f69e"). InnerVolumeSpecName "kube-api-access-js9lf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:47:19.426241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.426198 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a18a641c-ece0-454e-b012-38bed347f69e" (UID: "a18a641c-ece0-454e-b012-38bed347f69e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:47:19.471633 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.471594 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:47:19.471633 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.471622 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a18a641c-ece0-454e-b012-38bed347f69e-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:47:19.471633 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.471634 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:47:19.471633 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.471644 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:47:19.472000 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.471654 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-js9lf\" (UniqueName: \"kubernetes.io/projected/a18a641c-ece0-454e-b012-38bed347f69e-kube-api-access-js9lf\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:47:19.472000 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.471663 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a18a641c-ece0-454e-b012-38bed347f69e-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:47:19.686337 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.686302 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs_a18a641c-ece0-454e-b012-38bed347f69e/main/0.log" Apr 22 19:47:19.686851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.686706 2576 generic.go:358] "Generic (PLEG): container finished" podID="a18a641c-ece0-454e-b012-38bed347f69e" containerID="ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23" exitCode=137 Apr 22 19:47:19.686851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.686805 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" Apr 22 19:47:19.686851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.686822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" event={"ID":"a18a641c-ece0-454e-b012-38bed347f69e","Type":"ContainerDied","Data":"ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23"} Apr 22 19:47:19.686851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.686853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs" event={"ID":"a18a641c-ece0-454e-b012-38bed347f69e","Type":"ContainerDied","Data":"cda9122be11cee5775c5ac09b9c571508bb1271a6d52875cc3cdaf8c9daae319"} Apr 22 19:47:19.687095 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.686868 2576 scope.go:117] "RemoveContainer" containerID="ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23" Apr 22 19:47:19.706645 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.706624 2576 scope.go:117] "RemoveContainer" containerID="e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2" Apr 22 19:47:19.714598 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.714567 2576 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs"] Apr 22 19:47:19.718226 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.718201 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-7d474f68b6kmffs"] Apr 22 19:47:19.762214 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.762189 2576 scope.go:117] "RemoveContainer" containerID="ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23" Apr 22 19:47:19.762555 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:47:19.762530 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23\": container with ID starting with ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23 not found: ID does not exist" containerID="ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23" Apr 22 19:47:19.762652 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.762570 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23"} err="failed to get container status \"ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23\": rpc error: code = NotFound desc = could not find container \"ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23\": container with ID starting with ac0fe0841aec8ecc5e1158eab2368bfbabfc1ec8e2d226f9540f0d149c01ae23 not found: ID does not exist" Apr 22 19:47:19.762652 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.762599 2576 scope.go:117] "RemoveContainer" containerID="e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2" Apr 22 19:47:19.762936 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:47:19.762907 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2\": container with ID starting with e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2 not found: ID does not exist" containerID="e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2" Apr 22 19:47:19.763056 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:19.762938 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2"} err="failed to get container status \"e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2\": rpc error: code = NotFound desc = could not find container \"e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2\": container with ID starting with e4c0058042292dd5e8f45666e52ee63ec781a2ab4ada18e22871d173d1a3cde2 not found: ID does not exist" Apr 22 19:47:20.755029 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:20.754982 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18a641c-ece0-454e-b012-38bed347f69e" path="/var/lib/kubelet/pods/a18a641c-ece0-454e-b012-38bed347f69e/volumes" Apr 22 19:47:28.481266 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:28.481215 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 22 19:47:38.480792 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:38.480743 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: 
connect: connection refused" Apr 22 19:47:48.490574 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:48.490532 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:47:48.498120 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:47:48.498092 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:48:16.077632 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:16.077598 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2"] Apr 22 19:48:16.078152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:16.077902 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="main" containerID="cri-o://30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5" gracePeriod=30 Apr 22 19:48:16.078152 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:16.078003 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="tokenizer" containerID="cri-o://146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164" gracePeriod=30 Apr 22 19:48:16.099334 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:16.099302 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s"] Apr 22 19:48:16.100056 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:16.099714 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" containerID="cri-o://9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e" gracePeriod=30 Apr 22 19:48:16.916691 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:16.916659 2576 generic.go:358] "Generic (PLEG): container finished" podID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerID="30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5" exitCode=0 Apr 22 19:48:16.916894 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:16.916734 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" event={"ID":"024ee7dc-6cea-40b2-ab70-37dc1568424e","Type":"ContainerDied","Data":"30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5"} Apr 22 19:48:17.329601 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.329574 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:48:17.426934 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.426893 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-tmp\") pod \"024ee7dc-6cea-40b2-ab70-37dc1568424e\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " Apr 22 19:48:17.427123 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.426944 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-kserve-provision-location\") pod \"024ee7dc-6cea-40b2-ab70-37dc1568424e\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " Apr 22 19:48:17.427123 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.426992 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8fh8\" (UniqueName: \"kubernetes.io/projected/024ee7dc-6cea-40b2-ab70-37dc1568424e-kube-api-access-q8fh8\") pod \"024ee7dc-6cea-40b2-ab70-37dc1568424e\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " Apr 22 19:48:17.427123 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.427026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-uds\") pod \"024ee7dc-6cea-40b2-ab70-37dc1568424e\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " Apr 22 19:48:17.427123 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.427057 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-cache\") pod \"024ee7dc-6cea-40b2-ab70-37dc1568424e\" (UID: 
\"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " Apr 22 19:48:17.427123 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.427121 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/024ee7dc-6cea-40b2-ab70-37dc1568424e-tls-certs\") pod \"024ee7dc-6cea-40b2-ab70-37dc1568424e\" (UID: \"024ee7dc-6cea-40b2-ab70-37dc1568424e\") " Apr 22 19:48:17.427399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.427315 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "024ee7dc-6cea-40b2-ab70-37dc1568424e" (UID: "024ee7dc-6cea-40b2-ab70-37dc1568424e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:17.427399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.427355 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "024ee7dc-6cea-40b2-ab70-37dc1568424e" (UID: "024ee7dc-6cea-40b2-ab70-37dc1568424e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:17.427529 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.427407 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "024ee7dc-6cea-40b2-ab70-37dc1568424e" (UID: "024ee7dc-6cea-40b2-ab70-37dc1568424e"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:17.427529 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.427418 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-tmp\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:17.427742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.427717 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "024ee7dc-6cea-40b2-ab70-37dc1568424e" (UID: "024ee7dc-6cea-40b2-ab70-37dc1568424e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:17.429254 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.429229 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024ee7dc-6cea-40b2-ab70-37dc1568424e-kube-api-access-q8fh8" (OuterVolumeSpecName: "kube-api-access-q8fh8") pod "024ee7dc-6cea-40b2-ab70-37dc1568424e" (UID: "024ee7dc-6cea-40b2-ab70-37dc1568424e"). InnerVolumeSpecName "kube-api-access-q8fh8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:48:17.429420 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.429403 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/024ee7dc-6cea-40b2-ab70-37dc1568424e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "024ee7dc-6cea-40b2-ab70-37dc1568424e" (UID: "024ee7dc-6cea-40b2-ab70-37dc1568424e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:48:17.528330 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.528287 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8fh8\" (UniqueName: \"kubernetes.io/projected/024ee7dc-6cea-40b2-ab70-37dc1568424e-kube-api-access-q8fh8\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:17.528330 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.528317 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-uds\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:17.528330 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.528328 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-tokenizer-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:17.528330 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.528339 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/024ee7dc-6cea-40b2-ab70-37dc1568424e-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:17.528746 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.528369 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/024ee7dc-6cea-40b2-ab70-37dc1568424e-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:17.921781 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.921684 2576 generic.go:358] "Generic (PLEG): container finished" podID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerID="146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164" exitCode=0 Apr 22 19:48:17.921781 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.921740 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" event={"ID":"024ee7dc-6cea-40b2-ab70-37dc1568424e","Type":"ContainerDied","Data":"146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164"} Apr 22 19:48:17.921781 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.921757 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" Apr 22 19:48:17.921781 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.921767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2" event={"ID":"024ee7dc-6cea-40b2-ab70-37dc1568424e","Type":"ContainerDied","Data":"d23e2b1164e02bb280060022e2217e8e282ef0a0174450ad00770f928c145e52"} Apr 22 19:48:17.921781 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.921783 2576 scope.go:117] "RemoveContainer" containerID="146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164" Apr 22 19:48:17.931933 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.931908 2576 scope.go:117] "RemoveContainer" containerID="30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5" Apr 22 19:48:17.941212 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.941195 2576 scope.go:117] "RemoveContainer" containerID="a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c" Apr 22 19:48:17.949271 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.949245 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2"] Apr 22 19:48:17.949978 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.949927 2576 scope.go:117] "RemoveContainer" containerID="146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164" Apr 22 19:48:17.950320 ip-10-0-132-160 kubenswrapper[2576]: 
E0422 19:48:17.950286 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164\": container with ID starting with 146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164 not found: ID does not exist" containerID="146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164" Apr 22 19:48:17.950436 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.950322 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164"} err="failed to get container status \"146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164\": rpc error: code = NotFound desc = could not find container \"146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164\": container with ID starting with 146fe8cecec8105965025bf4000249f53d9d31f486b4cb65c0b1937dc7204164 not found: ID does not exist" Apr 22 19:48:17.950436 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.950349 2576 scope.go:117] "RemoveContainer" containerID="30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5" Apr 22 19:48:17.950640 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:48:17.950618 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5\": container with ID starting with 30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5 not found: ID does not exist" containerID="30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5" Apr 22 19:48:17.950726 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.950646 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5"} err="failed to get container 
status \"30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5\": rpc error: code = NotFound desc = could not find container \"30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5\": container with ID starting with 30ac6cc17e9e7673ddac11023365260c8a5887256b59e2f2a2e54cf3ff428be5 not found: ID does not exist" Apr 22 19:48:17.950726 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.950662 2576 scope.go:117] "RemoveContainer" containerID="a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c" Apr 22 19:48:17.950883 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:48:17.950862 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c\": container with ID starting with a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c not found: ID does not exist" containerID="a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c" Apr 22 19:48:17.950945 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.950887 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c"} err="failed to get container status \"a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c\": rpc error: code = NotFound desc = could not find container \"a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c\": container with ID starting with a0d77927c41d01bcae90773b3fa85499b0a5e5a73d7db8e8d1b250acfaa6dc6c not found: ID does not exist" Apr 22 19:48:17.952172 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:17.952151 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche8ttr2"] Apr 22 19:48:18.756160 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:18.756125 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" path="/var/lib/kubelet/pods/024ee7dc-6cea-40b2-ab70-37dc1568424e/volumes" Apr 22 19:48:23.384909 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.384876 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs"] Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385259 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="storage-initializer" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385272 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="storage-initializer" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385287 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="storage-initializer" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385294 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="storage-initializer" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385309 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385317 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385332 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="tokenizer" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385337 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="tokenizer" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385352 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="main" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385358 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="main" Apr 22 19:48:23.385417 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385426 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a18a641c-ece0-454e-b012-38bed347f69e" containerName="main" Apr 22 19:48:23.385989 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385437 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="main" Apr 22 19:48:23.385989 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.385450 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="024ee7dc-6cea-40b2-ab70-37dc1568424e" containerName="tokenizer" Apr 22 19:48:23.390785 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.390763 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.393883 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.393858 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 19:48:23.395143 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.395124 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-rs5gx\"" Apr 22 19:48:23.424840 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.424810 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs"] Apr 22 19:48:23.488077 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.488047 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfrhb\" (UniqueName: \"kubernetes.io/projected/91a258a7-24b8-4431-9958-dc5164f90c1f-kube-api-access-tfrhb\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.488241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.488106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-dshm\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.488241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.488136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91a258a7-24b8-4431-9958-dc5164f90c1f-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.488241 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.488166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-model-cache\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.488349 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.488239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-home\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.488349 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.488264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.589694 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.589660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfrhb\" (UniqueName: \"kubernetes.io/projected/91a258a7-24b8-4431-9958-dc5164f90c1f-kube-api-access-tfrhb\") 
pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.589864 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.589726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-dshm\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.589864 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.589747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91a258a7-24b8-4431-9958-dc5164f90c1f-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.589864 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.589798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-model-cache\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.589864 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.589852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-home\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 
19:48:23.590084 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.589880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.590310 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.590273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-model-cache\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.590418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.590288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.590418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.590354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-home\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.592024 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.591996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-dshm\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.592550 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.592529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91a258a7-24b8-4431-9958-dc5164f90c1f-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.604104 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.604065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfrhb\" (UniqueName: \"kubernetes.io/projected/91a258a7-24b8-4431-9958-dc5164f90c1f-kube-api-access-tfrhb\") pod \"custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.701733 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.701640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:23.758853 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.758814 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs"] Apr 22 19:48:23.766295 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.765645 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.769859 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.769568 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-6w77f\"" Apr 22 19:48:23.775066 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.775042 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs"] Apr 22 19:48:23.793202 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.793162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.793327 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.793272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.793394 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.793336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72cb9b44-37dc-4f18-814f-30ba035d0569-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: 
\"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.793394 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.793381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.793538 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.793435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpp4w\" (UniqueName: \"kubernetes.io/projected/72cb9b44-37dc-4f18-814f-30ba035d0569-kube-api-access-fpp4w\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.793538 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.793524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.894890 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.894854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpp4w\" (UniqueName: \"kubernetes.io/projected/72cb9b44-37dc-4f18-814f-30ba035d0569-kube-api-access-fpp4w\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.894910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.894941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.894971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.895018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72cb9b44-37dc-4f18-814f-30ba035d0569-tls-certs\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895086 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.895056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895362 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.895334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895416 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.895375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895463 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.895406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.895463 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.895458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.897442 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.897420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72cb9b44-37dc-4f18-814f-30ba035d0569-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:23.904001 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:23.903978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpp4w\" (UniqueName: \"kubernetes.io/projected/72cb9b44-37dc-4f18-814f-30ba035d0569-kube-api-access-fpp4w\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:24.054706 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:24.054675 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs"] Apr 22 19:48:24.056258 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:48:24.056221 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a258a7_24b8_4431_9958_dc5164f90c1f.slice/crio-82f827dba297ac42477dbfd52b65c18c0bd0fc7562ae6ef6b2d43a7c2ae1b906 WatchSource:0}: Error finding container 82f827dba297ac42477dbfd52b65c18c0bd0fc7562ae6ef6b2d43a7c2ae1b906: Status 404 returned error can't find the container with id 82f827dba297ac42477dbfd52b65c18c0bd0fc7562ae6ef6b2d43a7c2ae1b906 Apr 22 19:48:24.083715 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:24.083689 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:24.217001 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:24.216819 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs"] Apr 22 19:48:24.219849 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:48:24.219819 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72cb9b44_37dc_4f18_814f_30ba035d0569.slice/crio-229e1befe974af5db7eef2049c066cd18d24e89be0c0258975e4888b6567fc36 WatchSource:0}: Error finding container 229e1befe974af5db7eef2049c066cd18d24e89be0c0258975e4888b6567fc36: Status 404 returned error can't find the container with id 229e1befe974af5db7eef2049c066cd18d24e89be0c0258975e4888b6567fc36 Apr 22 19:48:24.951604 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:24.951562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" event={"ID":"72cb9b44-37dc-4f18-814f-30ba035d0569","Type":"ContainerStarted","Data":"7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7"} Apr 22 19:48:24.951604 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:24.951604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" event={"ID":"72cb9b44-37dc-4f18-814f-30ba035d0569","Type":"ContainerStarted","Data":"229e1befe974af5db7eef2049c066cd18d24e89be0c0258975e4888b6567fc36"} Apr 22 19:48:24.953514 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:24.953476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" event={"ID":"91a258a7-24b8-4431-9958-dc5164f90c1f","Type":"ContainerStarted","Data":"82f827dba297ac42477dbfd52b65c18c0bd0fc7562ae6ef6b2d43a7c2ae1b906"} Apr 22 19:48:25.958663 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:25.958623 2576 generic.go:358] "Generic (PLEG): container finished" podID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerID="7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7" exitCode=0 Apr 22 19:48:25.959141 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:25.958706 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" event={"ID":"72cb9b44-37dc-4f18-814f-30ba035d0569","Type":"ContainerDied","Data":"7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7"} Apr 22 19:48:25.960253 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:25.960215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" event={"ID":"91a258a7-24b8-4431-9958-dc5164f90c1f","Type":"ContainerStarted","Data":"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8"} Apr 22 19:48:25.960386 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:25.960319 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:26.968066 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:26.968016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" event={"ID":"72cb9b44-37dc-4f18-814f-30ba035d0569","Type":"ContainerStarted","Data":"c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8"} Apr 22 19:48:26.968545 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:26.968074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" event={"ID":"72cb9b44-37dc-4f18-814f-30ba035d0569","Type":"ContainerStarted","Data":"6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b"} Apr 22 19:48:26.968545 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:26.968296 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:26.970068 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:26.970035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" event={"ID":"91a258a7-24b8-4431-9958-dc5164f90c1f","Type":"ContainerStarted","Data":"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c"} Apr 22 19:48:26.992162 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:26.992094 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" podStartSLOduration=3.992076178 podStartE2EDuration="3.992076178s" podCreationTimestamp="2026-04-22 19:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:48:26.99107437 +0000 UTC m=+1456.776064256" watchObservedRunningTime="2026-04-22 19:48:26.992076178 +0000 UTC m=+1456.777066055" Apr 22 19:48:30.986220 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:30.986122 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerID="4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c" exitCode=0 Apr 22 19:48:30.986220 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:30.986204 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" event={"ID":"91a258a7-24b8-4431-9958-dc5164f90c1f","Type":"ContainerDied","Data":"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c"} Apr 22 19:48:31.992143 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:31.992099 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" event={"ID":"91a258a7-24b8-4431-9958-dc5164f90c1f","Type":"ContainerStarted","Data":"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c"} Apr 22 19:48:32.023669 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:32.023611 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podStartSLOduration=8.182354341 podStartE2EDuration="9.02359326s" podCreationTimestamp="2026-04-22 19:48:23 +0000 UTC" firstStartedPulling="2026-04-22 19:48:24.058104748 +0000 UTC m=+1453.843094598" lastFinishedPulling="2026-04-22 19:48:24.899343663 +0000 UTC m=+1454.684333517" observedRunningTime="2026-04-22 19:48:32.019925282 +0000 UTC m=+1461.804915156" watchObservedRunningTime="2026-04-22 19:48:32.02359326 +0000 UTC m=+1461.808583131" Apr 22 19:48:33.702606 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:33.702559 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:33.703007 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:33.702618 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:33.703912 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:33.703874 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:48:34.084025 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:34.083978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:34.084025 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:34.084034 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:34.087351 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:34.087324 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:35.006641 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:35.006612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:48:43.702041 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:43.702000 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:48:43.715594 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:43.715562 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:48:46.422959 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.422933 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:48:46.521685 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.521652 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90592f52-d190-4a9c-b724-26fa09066937-tls-certs\") pod \"90592f52-d190-4a9c-b724-26fa09066937\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " Apr 22 19:48:46.521870 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.521743 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-dshm\") pod \"90592f52-d190-4a9c-b724-26fa09066937\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " Apr 22 19:48:46.521870 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.521778 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-home\") pod \"90592f52-d190-4a9c-b724-26fa09066937\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " Apr 22 19:48:46.521870 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.521801 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-model-cache\") pod \"90592f52-d190-4a9c-b724-26fa09066937\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " Apr 22 19:48:46.522057 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.521872 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtwdj\" 
(UniqueName: \"kubernetes.io/projected/90592f52-d190-4a9c-b724-26fa09066937-kube-api-access-jtwdj\") pod \"90592f52-d190-4a9c-b724-26fa09066937\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " Apr 22 19:48:46.522057 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.521921 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-kserve-provision-location\") pod \"90592f52-d190-4a9c-b724-26fa09066937\" (UID: \"90592f52-d190-4a9c-b724-26fa09066937\") " Apr 22 19:48:46.522174 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.522120 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-model-cache" (OuterVolumeSpecName: "model-cache") pod "90592f52-d190-4a9c-b724-26fa09066937" (UID: "90592f52-d190-4a9c-b724-26fa09066937"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:46.522225 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.522168 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-home" (OuterVolumeSpecName: "home") pod "90592f52-d190-4a9c-b724-26fa09066937" (UID: "90592f52-d190-4a9c-b724-26fa09066937"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:46.522345 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.522324 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.522409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.522348 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.524009 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.523985 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-dshm" (OuterVolumeSpecName: "dshm") pod "90592f52-d190-4a9c-b724-26fa09066937" (UID: "90592f52-d190-4a9c-b724-26fa09066937"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:46.524292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.524271 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90592f52-d190-4a9c-b724-26fa09066937-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "90592f52-d190-4a9c-b724-26fa09066937" (UID: "90592f52-d190-4a9c-b724-26fa09066937"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:48:46.524405 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.524384 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90592f52-d190-4a9c-b724-26fa09066937-kube-api-access-jtwdj" (OuterVolumeSpecName: "kube-api-access-jtwdj") pod "90592f52-d190-4a9c-b724-26fa09066937" (UID: "90592f52-d190-4a9c-b724-26fa09066937"). InnerVolumeSpecName "kube-api-access-jtwdj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:48:46.534356 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.534332 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "90592f52-d190-4a9c-b724-26fa09066937" (UID: "90592f52-d190-4a9c-b724-26fa09066937"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:46.623177 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.623143 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jtwdj\" (UniqueName: \"kubernetes.io/projected/90592f52-d190-4a9c-b724-26fa09066937-kube-api-access-jtwdj\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.623177 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.623171 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.623177 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.623181 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90592f52-d190-4a9c-b724-26fa09066937-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:46.623409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:46.623193 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90592f52-d190-4a9c-b724-26fa09066937-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:48:47.059391 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.059359 2576 generic.go:358] "Generic (PLEG): container finished" podID="90592f52-d190-4a9c-b724-26fa09066937" 
containerID="9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e" exitCode=137 Apr 22 19:48:47.059603 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.059448 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" Apr 22 19:48:47.059603 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.059445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" event={"ID":"90592f52-d190-4a9c-b724-26fa09066937","Type":"ContainerDied","Data":"9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e"} Apr 22 19:48:47.059603 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.059558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s" event={"ID":"90592f52-d190-4a9c-b724-26fa09066937","Type":"ContainerDied","Data":"3e26c372fd6bb3eb22c3584dfb1f3fa7ec6be826dc074a09f3f8a5b88303beda"} Apr 22 19:48:47.059603 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.059574 2576 scope.go:117] "RemoveContainer" containerID="9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e" Apr 22 19:48:47.079760 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.079733 2576 scope.go:117] "RemoveContainer" containerID="94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887" Apr 22 19:48:47.099433 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.099410 2576 scope.go:117] "RemoveContainer" containerID="9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e" Apr 22 19:48:47.099770 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:48:47.099743 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e\": container with ID starting with 
9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e not found: ID does not exist" containerID="9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e" Apr 22 19:48:47.099833 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.099780 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e"} err="failed to get container status \"9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e\": rpc error: code = NotFound desc = could not find container \"9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e\": container with ID starting with 9f812d897e7d59ed9a0f28899ef53a5dcf3a954f84acaea0f839a960f186767e not found: ID does not exist" Apr 22 19:48:47.099833 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.099800 2576 scope.go:117] "RemoveContainer" containerID="94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887" Apr 22 19:48:47.100051 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:48:47.100035 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887\": container with ID starting with 94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887 not found: ID does not exist" containerID="94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887" Apr 22 19:48:47.100108 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.100053 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887"} err="failed to get container status \"94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887\": rpc error: code = NotFound desc = could not find container \"94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887\": container with ID starting with 
94b45826b00375e672be06c1ff726a95f24997db0eccaec42ded4c6536558887 not found: ID does not exist" Apr 22 19:48:47.118486 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.118454 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s"] Apr 22 19:48:47.126735 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:47.126710 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-77vck8s"] Apr 22 19:48:48.757145 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:48.757103 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90592f52-d190-4a9c-b724-26fa09066937" path="/var/lib/kubelet/pods/90592f52-d190-4a9c-b724-26fa09066937/volumes" Apr 22 19:48:53.702212 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:53.702159 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:48:56.012155 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:48:56.012127 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:49:00.049379 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.049341 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:49:00.049790 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.049627 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="41637b2e-b65a-4995-9476-c1cf0502ce40" containerName="main" 
containerID="cri-o://cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521" gracePeriod=30 Apr 22 19:49:00.814724 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.814701 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:49:00.864523 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.864448 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf5d6\" (UniqueName: \"kubernetes.io/projected/41637b2e-b65a-4995-9476-c1cf0502ce40-kube-api-access-jf5d6\") pod \"41637b2e-b65a-4995-9476-c1cf0502ce40\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " Apr 22 19:49:00.864685 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.864527 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41637b2e-b65a-4995-9476-c1cf0502ce40-tls-certs\") pod \"41637b2e-b65a-4995-9476-c1cf0502ce40\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " Apr 22 19:49:00.864685 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.864622 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-model-cache\") pod \"41637b2e-b65a-4995-9476-c1cf0502ce40\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " Apr 22 19:49:00.864800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.864682 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-home\") pod \"41637b2e-b65a-4995-9476-c1cf0502ce40\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " Apr 22 19:49:00.864800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.864707 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-dshm\") pod \"41637b2e-b65a-4995-9476-c1cf0502ce40\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " Apr 22 19:49:00.864800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.864734 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-kserve-provision-location\") pod \"41637b2e-b65a-4995-9476-c1cf0502ce40\" (UID: \"41637b2e-b65a-4995-9476-c1cf0502ce40\") " Apr 22 19:49:00.864964 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.864895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-model-cache" (OuterVolumeSpecName: "model-cache") pod "41637b2e-b65a-4995-9476-c1cf0502ce40" (UID: "41637b2e-b65a-4995-9476-c1cf0502ce40"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:00.865144 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.865070 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.865144 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.865071 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-home" (OuterVolumeSpecName: "home") pod "41637b2e-b65a-4995-9476-c1cf0502ce40" (UID: "41637b2e-b65a-4995-9476-c1cf0502ce40"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:00.866926 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.866895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-dshm" (OuterVolumeSpecName: "dshm") pod "41637b2e-b65a-4995-9476-c1cf0502ce40" (UID: "41637b2e-b65a-4995-9476-c1cf0502ce40"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:00.867020 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.866951 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41637b2e-b65a-4995-9476-c1cf0502ce40-kube-api-access-jf5d6" (OuterVolumeSpecName: "kube-api-access-jf5d6") pod "41637b2e-b65a-4995-9476-c1cf0502ce40" (UID: "41637b2e-b65a-4995-9476-c1cf0502ce40"). InnerVolumeSpecName "kube-api-access-jf5d6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:49:00.867255 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.867228 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41637b2e-b65a-4995-9476-c1cf0502ce40-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "41637b2e-b65a-4995-9476-c1cf0502ce40" (UID: "41637b2e-b65a-4995-9476-c1cf0502ce40"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:49:00.921764 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.921689 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41637b2e-b65a-4995-9476-c1cf0502ce40" (UID: "41637b2e-b65a-4995-9476-c1cf0502ce40"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:49:00.966519 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.966484 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.966647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.966539 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jf5d6\" (UniqueName: \"kubernetes.io/projected/41637b2e-b65a-4995-9476-c1cf0502ce40-kube-api-access-jf5d6\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.966647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.966557 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41637b2e-b65a-4995-9476-c1cf0502ce40-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.966647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.966570 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:49:00.966647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:00.966577 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41637b2e-b65a-4995-9476-c1cf0502ce40-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:49:01.116411 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.116380 2576 generic.go:358] "Generic (PLEG): container finished" podID="41637b2e-b65a-4995-9476-c1cf0502ce40" containerID="cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521" exitCode=0 Apr 22 19:49:01.116848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.116439 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"41637b2e-b65a-4995-9476-c1cf0502ce40","Type":"ContainerDied","Data":"cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521"} Apr 22 19:49:01.116848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.116468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"41637b2e-b65a-4995-9476-c1cf0502ce40","Type":"ContainerDied","Data":"8e7e99579334b34291d753be880c63957d31c66cd011e6d838fe484222681c7b"} Apr 22 19:49:01.116848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.116483 2576 scope.go:117] "RemoveContainer" containerID="cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521" Apr 22 19:49:01.116848 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.116441 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 19:49:01.136223 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.136206 2576 scope.go:117] "RemoveContainer" containerID="3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f" Apr 22 19:49:01.140990 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.140963 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:49:01.147083 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.147062 2576 scope.go:117] "RemoveContainer" containerID="cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521" Apr 22 19:49:01.147370 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:49:01.147346 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521\": container with ID starting with 
cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521 not found: ID does not exist" containerID="cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521" Apr 22 19:49:01.147477 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.147380 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521"} err="failed to get container status \"cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521\": rpc error: code = NotFound desc = could not find container \"cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521\": container with ID starting with cefbdc6dae310bcebf78e73953af40afc3c7acb1de2fff4e2e864b6bf8d45521 not found: ID does not exist" Apr 22 19:49:01.147477 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.147406 2576 scope.go:117] "RemoveContainer" containerID="3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f" Apr 22 19:49:01.147766 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:49:01.147743 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f\": container with ID starting with 3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f not found: ID does not exist" containerID="3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f" Apr 22 19:49:01.147842 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.147777 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f"} err="failed to get container status \"3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f\": rpc error: code = NotFound desc = could not find container \"3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f\": container with ID starting with 
3624d79c30b0beb68168283ca613a88cfa515391ed28f727e43882a59584e22f not found: ID does not exist" Apr 22 19:49:01.148885 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:01.148866 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 19:49:02.755571 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:02.755539 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41637b2e-b65a-4995-9476-c1cf0502ce40" path="/var/lib/kubelet/pods/41637b2e-b65a-4995-9476-c1cf0502ce40/volumes" Apr 22 19:49:03.702603 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:03.702564 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:49:10.756586 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:10.756558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log" Apr 22 19:49:10.758254 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:10.758225 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log" Apr 22 19:49:10.760799 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:10.760774 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log" Apr 22 19:49:10.762453 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:10.762433 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log" Apr 22 19:49:13.702729 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:13.702680 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:49:23.702691 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:23.702571 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:49:33.702821 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:33.702770 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:49:43.703027 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:43.702972 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:49:53.702844 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:49:53.702794 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" 
podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8001/health\": dial tcp 10.134.0.50:8001: connect: connection refused" Apr 22 19:50:03.717279 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:03.717242 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:50:03.729705 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:03.729684 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" Apr 22 19:50:25.640676 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:25.640637 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs"] Apr 22 19:50:25.643392 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:25.641057 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="main" containerID="cri-o://6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b" gracePeriod=30 Apr 22 19:50:25.643392 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:25.641147 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="tokenizer" containerID="cri-o://c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8" gracePeriod=30 Apr 22 19:50:25.643392 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:25.643359 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs"] Apr 22 19:50:25.643945 ip-10-0-132-160 kubenswrapper[2576]: 
I0422 19:50:25.643756 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" containerID="cri-o://7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c" gracePeriod=30 Apr 22 19:50:26.010851 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:50:26.010768 2576 logging.go:55] [core] [Channel #284 SubChannel #285]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.51:9003", ServerName: "10.134.0.51:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.51:9003: connect: connection refused" Apr 22 19:50:26.442634 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.442597 2576 generic.go:358] "Generic (PLEG): container finished" podID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerID="6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b" exitCode=0 Apr 22 19:50:26.442800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.442669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" event={"ID":"72cb9b44-37dc-4f18-814f-30ba035d0569","Type":"ContainerDied","Data":"6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b"} Apr 22 19:50:26.908076 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.908045 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:50:26.988989 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.988956 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72cb9b44-37dc-4f18-814f-30ba035d0569-tls-certs\") pod \"72cb9b44-37dc-4f18-814f-30ba035d0569\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " Apr 22 19:50:26.989148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989033 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpp4w\" (UniqueName: \"kubernetes.io/projected/72cb9b44-37dc-4f18-814f-30ba035d0569-kube-api-access-fpp4w\") pod \"72cb9b44-37dc-4f18-814f-30ba035d0569\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " Apr 22 19:50:26.989148 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989120 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-tmp\") pod \"72cb9b44-37dc-4f18-814f-30ba035d0569\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " Apr 22 19:50:26.989247 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989185 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-uds\") pod \"72cb9b44-37dc-4f18-814f-30ba035d0569\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " Apr 22 19:50:26.989247 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989231 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-kserve-provision-location\") pod \"72cb9b44-37dc-4f18-814f-30ba035d0569\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " 
Apr 22 19:50:26.989325 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989301 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-cache\") pod \"72cb9b44-37dc-4f18-814f-30ba035d0569\" (UID: \"72cb9b44-37dc-4f18-814f-30ba035d0569\") " Apr 22 19:50:26.989476 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989442 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "72cb9b44-37dc-4f18-814f-30ba035d0569" (UID: "72cb9b44-37dc-4f18-814f-30ba035d0569"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:26.989641 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989543 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "72cb9b44-37dc-4f18-814f-30ba035d0569" (UID: "72cb9b44-37dc-4f18-814f-30ba035d0569"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:26.989641 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989615 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "72cb9b44-37dc-4f18-814f-30ba035d0569" (UID: "72cb9b44-37dc-4f18-814f-30ba035d0569"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:26.989727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989683 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-tmp\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:50:26.989727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989695 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-uds\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:50:26.989727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989704 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-tokenizer-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:50:26.989962 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.989941 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "72cb9b44-37dc-4f18-814f-30ba035d0569" (UID: "72cb9b44-37dc-4f18-814f-30ba035d0569"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:26.991632 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.991609 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72cb9b44-37dc-4f18-814f-30ba035d0569-kube-api-access-fpp4w" (OuterVolumeSpecName: "kube-api-access-fpp4w") pod "72cb9b44-37dc-4f18-814f-30ba035d0569" (UID: "72cb9b44-37dc-4f18-814f-30ba035d0569"). InnerVolumeSpecName "kube-api-access-fpp4w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:50:26.991709 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:26.991630 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72cb9b44-37dc-4f18-814f-30ba035d0569-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "72cb9b44-37dc-4f18-814f-30ba035d0569" (UID: "72cb9b44-37dc-4f18-814f-30ba035d0569"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:50:27.010899 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.010825 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.51:9003\" within 1s: context deadline exceeded" Apr 22 19:50:27.090564 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.090521 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72cb9b44-37dc-4f18-814f-30ba035d0569-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:50:27.090564 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.090560 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fpp4w\" (UniqueName: \"kubernetes.io/projected/72cb9b44-37dc-4f18-814f-30ba035d0569-kube-api-access-fpp4w\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:50:27.090564 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.090572 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72cb9b44-37dc-4f18-814f-30ba035d0569-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:50:27.448803 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.448770 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerID="c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8" exitCode=0 Apr 22 19:50:27.449004 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.448840 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" Apr 22 19:50:27.449004 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.448854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" event={"ID":"72cb9b44-37dc-4f18-814f-30ba035d0569","Type":"ContainerDied","Data":"c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8"} Apr 22 19:50:27.449004 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.448900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs" event={"ID":"72cb9b44-37dc-4f18-814f-30ba035d0569","Type":"ContainerDied","Data":"229e1befe974af5db7eef2049c066cd18d24e89be0c0258975e4888b6567fc36"} Apr 22 19:50:27.449004 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.448922 2576 scope.go:117] "RemoveContainer" containerID="c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8" Apr 22 19:50:27.457687 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.457671 2576 scope.go:117] "RemoveContainer" containerID="6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b" Apr 22 19:50:27.464984 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.464961 2576 scope.go:117] "RemoveContainer" containerID="7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7" Apr 22 19:50:27.472013 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.471998 2576 scope.go:117] "RemoveContainer" containerID="c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8" Apr 22 19:50:27.472240 ip-10-0-132-160 
kubenswrapper[2576]: E0422 19:50:27.472223 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8\": container with ID starting with c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8 not found: ID does not exist" containerID="c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8" Apr 22 19:50:27.472284 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.472247 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8"} err="failed to get container status \"c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8\": rpc error: code = NotFound desc = could not find container \"c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8\": container with ID starting with c68d128f40f4c74da35fc1958701c911765df53f1819cb94b33830901f1f41c8 not found: ID does not exist" Apr 22 19:50:27.472284 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.472264 2576 scope.go:117] "RemoveContainer" containerID="6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b" Apr 22 19:50:27.472526 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:50:27.472480 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b\": container with ID starting with 6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b not found: ID does not exist" containerID="6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b" Apr 22 19:50:27.472526 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.472520 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b"} 
err="failed to get container status \"6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b\": rpc error: code = NotFound desc = could not find container \"6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b\": container with ID starting with 6dc20b1430770f99c4e69680829b584a23bbaf8643b1536262d4ed83453b103b not found: ID does not exist" Apr 22 19:50:27.472637 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.472532 2576 scope.go:117] "RemoveContainer" containerID="7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7" Apr 22 19:50:27.472725 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:50:27.472706 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7\": container with ID starting with 7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7 not found: ID does not exist" containerID="7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7" Apr 22 19:50:27.472763 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.472727 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7"} err="failed to get container status \"7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7\": rpc error: code = NotFound desc = could not find container \"7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7\": container with ID starting with 7cb875f3e3a24006ce6ef2fb10396ccacc4a940e3af02e51e191bb6dfbbe1ff7 not found: ID does not exist" Apr 22 19:50:27.479049 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.479028 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs"] Apr 22 19:50:27.483033 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:27.483015 2576 kubelet.go:2547] "SyncLoop 
REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74ffbj7vbs"] Apr 22 19:50:28.756817 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:28.756785 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" path="/var/lib/kubelet/pods/72cb9b44-37dc-4f18-814f-30ba035d0569/volumes" Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.520645 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"] Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521369 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521387 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main" Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521420 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="storage-initializer" Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521429 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="storage-initializer" Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521451 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="storage-initializer" Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521460 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="storage-initializer" Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: 
I0422 19:50:40.521474 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41637b2e-b65a-4995-9476-c1cf0502ce40" containerName="main"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521482 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="41637b2e-b65a-4995-9476-c1cf0502ce40" containerName="main"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521517 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="tokenizer"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521527 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="tokenizer"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521555 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41637b2e-b65a-4995-9476-c1cf0502ce40" containerName="storage-initializer"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521564 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="41637b2e-b65a-4995-9476-c1cf0502ce40" containerName="storage-initializer"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521577 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="main"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521584 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="main"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521772 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="main"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521795 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="41637b2e-b65a-4995-9476-c1cf0502ce40" containerName="main"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521805 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72cb9b44-37dc-4f18-814f-30ba035d0569" containerName="tokenizer"
Apr 22 19:50:40.525615 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.521816 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="90592f52-d190-4a9c-b724-26fa09066937" containerName="main"
Apr 22 19:50:40.529010 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.528984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.534325 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.534304 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 22 19:50:40.543989 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.543964 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"]
Apr 22 19:50:40.710984 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.710954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-home\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.711190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.711003 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90b0b081-0e8b-4de5-8482-87297243aeae-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.711190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.711098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bfl7\" (UniqueName: \"kubernetes.io/projected/90b0b081-0e8b-4de5-8482-87297243aeae-kube-api-access-5bfl7\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.711190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.711160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.711190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.711185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.711374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.711269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812032 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.811952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90b0b081-0e8b-4de5-8482-87297243aeae-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812032 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.812012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bfl7\" (UniqueName: \"kubernetes.io/projected/90b0b081-0e8b-4de5-8482-87297243aeae-kube-api-access-5bfl7\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812237 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.812043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812237 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.812060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812237 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.812109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812237 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.812129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-home\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812580 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.812553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812709 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.812597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-home\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.812709 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.812619 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.814381 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.814362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.814643 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.814627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90b0b081-0e8b-4de5-8482-87297243aeae-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.821307 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.821284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bfl7\" (UniqueName: \"kubernetes.io/projected/90b0b081-0e8b-4de5-8482-87297243aeae-kube-api-access-5bfl7\") pod \"router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.840069 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.840041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:40.974475 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:40.974451 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"]
Apr 22 19:50:40.977228 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:50:40.977184 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b0b081_0e8b_4de5_8482_87297243aeae.slice/crio-e59c649c067cbb0de1c20710eb4604271a353eb11bd1db2a17d8d43fff962039 WatchSource:0}: Error finding container e59c649c067cbb0de1c20710eb4604271a353eb11bd1db2a17d8d43fff962039: Status 404 returned error can't find the container with id e59c649c067cbb0de1c20710eb4604271a353eb11bd1db2a17d8d43fff962039
Apr 22 19:50:41.013568 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.013543 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"]
Apr 22 19:50:41.017924 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.017905 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.021146 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.021126 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-bzxph\""
Apr 22 19:50:41.028103 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.028082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"]
Apr 22 19:50:41.114893 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.114804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vc2\" (UniqueName: \"kubernetes.io/projected/46f9b826-c792-41b8-a28c-9fc703563759-kube-api-access-g6vc2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.114893 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.114846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46f9b826-c792-41b8-a28c-9fc703563759-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.114893 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.114872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.115102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.114960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.115102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.115000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.115102 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.115045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216330 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216532 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216664 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vc2\" (UniqueName: \"kubernetes.io/projected/46f9b826-c792-41b8-a28c-9fc703563759-kube-api-access-g6vc2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216664 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46f9b826-c792-41b8-a28c-9fc703563759-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216664 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216830 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216830 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.216936 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.216861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.217048 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.217024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.218945 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.218925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46f9b826-c792-41b8-a28c-9fc703563759-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.225210 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.225188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vc2\" (UniqueName: \"kubernetes.io/projected/46f9b826-c792-41b8-a28c-9fc703563759-kube-api-access-g6vc2\") pod \"router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.328688 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.328651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:41.482955 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:50:41.482923 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f9b826_c792_41b8_a28c_9fc703563759.slice/crio-c9778fd9a4935afdfe0ca4d71a092607a60b194991ebd36f2334634054f5b762 WatchSource:0}: Error finding container c9778fd9a4935afdfe0ca4d71a092607a60b194991ebd36f2334634054f5b762: Status 404 returned error can't find the container with id c9778fd9a4935afdfe0ca4d71a092607a60b194991ebd36f2334634054f5b762
Apr 22 19:50:41.483449 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.483424 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"]
Apr 22 19:50:41.499866 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.499833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" event={"ID":"90b0b081-0e8b-4de5-8482-87297243aeae","Type":"ContainerStarted","Data":"f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73"}
Apr 22 19:50:41.499997 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.499877 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" event={"ID":"90b0b081-0e8b-4de5-8482-87297243aeae","Type":"ContainerStarted","Data":"e59c649c067cbb0de1c20710eb4604271a353eb11bd1db2a17d8d43fff962039"}
Apr 22 19:50:41.501018 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:41.500996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" event={"ID":"46f9b826-c792-41b8-a28c-9fc703563759","Type":"ContainerStarted","Data":"c9778fd9a4935afdfe0ca4d71a092607a60b194991ebd36f2334634054f5b762"}
Apr 22 19:50:42.508538 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:42.508454 2576 generic.go:358] "Generic (PLEG): container finished" podID="46f9b826-c792-41b8-a28c-9fc703563759" containerID="8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7" exitCode=0
Apr 22 19:50:42.509242 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:42.508565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" event={"ID":"46f9b826-c792-41b8-a28c-9fc703563759","Type":"ContainerDied","Data":"8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7"}
Apr 22 19:50:43.515872 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:43.515841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" event={"ID":"46f9b826-c792-41b8-a28c-9fc703563759","Type":"ContainerStarted","Data":"7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b"}
Apr 22 19:50:43.516333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:43.515898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" event={"ID":"46f9b826-c792-41b8-a28c-9fc703563759","Type":"ContainerStarted","Data":"d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2"}
Apr 22 19:50:43.516333 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:43.516001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:43.542687 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:43.542627 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" podStartSLOduration=3.542609128 podStartE2EDuration="3.542609128s" podCreationTimestamp="2026-04-22 19:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:50:43.540565783 +0000 UTC m=+1593.325555653" watchObservedRunningTime="2026-04-22 19:50:43.542609128 +0000 UTC m=+1593.327599002"
Apr 22 19:50:46.528587 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:46.528555 2576 generic.go:358] "Generic (PLEG): container finished" podID="90b0b081-0e8b-4de5-8482-87297243aeae" containerID="f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73" exitCode=0
Apr 22 19:50:46.528989 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:46.528644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" event={"ID":"90b0b081-0e8b-4de5-8482-87297243aeae","Type":"ContainerDied","Data":"f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73"}
Apr 22 19:50:47.534828 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:47.534794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" event={"ID":"90b0b081-0e8b-4de5-8482-87297243aeae","Type":"ContainerStarted","Data":"8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235"}
Apr 22 19:50:47.561356 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:47.561289 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podStartSLOduration=7.561273334 podStartE2EDuration="7.561273334s" podCreationTimestamp="2026-04-22 19:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:50:47.55890519 +0000 UTC m=+1597.343895056" watchObservedRunningTime="2026-04-22 19:50:47.561273334 +0000 UTC m=+1597.346263206"
Apr 22 19:50:50.840529 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:50.840463 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:50.840529 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:50.840540 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"
Apr 22 19:50:50.842172 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:50.842138 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused"
Apr 22 19:50:51.329469 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:51.329419 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:51.329469 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:51.329457 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:51.332661 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:51.332634 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:51.554599 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:51.554568 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"
Apr 22 19:50:55.644700 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:55.644663 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="llm-d-routing-sidecar" containerID="cri-o://34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8" gracePeriod=2
Apr 22 19:50:55.957876 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:55.957849 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs_91a258a7-24b8-4431-9958-dc5164f90c1f/main/0.log"
Apr 22 19:50:55.958578 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:55.958559 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs"
Apr 22 19:50:56.055249 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055218 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91a258a7-24b8-4431-9958-dc5164f90c1f-tls-certs\") pod \"91a258a7-24b8-4431-9958-dc5164f90c1f\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") "
Apr 22 19:50:56.055249 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055253 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-model-cache\") pod \"91a258a7-24b8-4431-9958-dc5164f90c1f\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") "
Apr 22 19:50:56.055448 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055387 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-dshm\") pod \"91a258a7-24b8-4431-9958-dc5164f90c1f\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") "
Apr 22 19:50:56.055448 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055418 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-kserve-provision-location\") pod \"91a258a7-24b8-4431-9958-dc5164f90c1f\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") "
Apr 22 19:50:56.055593 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055464 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-home\") pod \"91a258a7-24b8-4431-9958-dc5164f90c1f\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") "
Apr 22 19:50:56.055593 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055529 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfrhb\" (UniqueName: \"kubernetes.io/projected/91a258a7-24b8-4431-9958-dc5164f90c1f-kube-api-access-tfrhb\") pod \"91a258a7-24b8-4431-9958-dc5164f90c1f\" (UID: \"91a258a7-24b8-4431-9958-dc5164f90c1f\") "
Apr 22 19:50:56.055593 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-model-cache" (OuterVolumeSpecName: "model-cache") pod "91a258a7-24b8-4431-9958-dc5164f90c1f" (UID: "91a258a7-24b8-4431-9958-dc5164f90c1f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:50:56.055892 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055856 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-home" (OuterVolumeSpecName: "home") pod "91a258a7-24b8-4431-9958-dc5164f90c1f" (UID: "91a258a7-24b8-4431-9958-dc5164f90c1f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:50:56.055892 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.055869 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:50:56.057647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.057615 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a258a7-24b8-4431-9958-dc5164f90c1f-kube-api-access-tfrhb" (OuterVolumeSpecName: "kube-api-access-tfrhb") pod "91a258a7-24b8-4431-9958-dc5164f90c1f" (UID: "91a258a7-24b8-4431-9958-dc5164f90c1f"). InnerVolumeSpecName "kube-api-access-tfrhb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:50:56.057647 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.057640 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a258a7-24b8-4431-9958-dc5164f90c1f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "91a258a7-24b8-4431-9958-dc5164f90c1f" (UID: "91a258a7-24b8-4431-9958-dc5164f90c1f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:50:56.057850 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.057826 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-dshm" (OuterVolumeSpecName: "dshm") pod "91a258a7-24b8-4431-9958-dc5164f90c1f" (UID: "91a258a7-24b8-4431-9958-dc5164f90c1f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:50:56.109423 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.109386 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "91a258a7-24b8-4431-9958-dc5164f90c1f" (UID: "91a258a7-24b8-4431-9958-dc5164f90c1f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:50:56.156621 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.156592 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:50:56.156621 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.156619 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:50:56.156800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.156628 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91a258a7-24b8-4431-9958-dc5164f90c1f-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:50:56.156800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.156638 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfrhb\" (UniqueName: \"kubernetes.io/projected/91a258a7-24b8-4431-9958-dc5164f90c1f-kube-api-access-tfrhb\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:50:56.156800 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.156649 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91a258a7-24b8-4431-9958-dc5164f90c1f-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 22 19:50:56.571877 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.571851 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs_91a258a7-24b8-4431-9958-dc5164f90c1f/main/0.log"
Apr 22 19:50:56.572547 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.572497 2576 generic.go:358] "Generic (PLEG): container finished" podID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerID="7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c" exitCode=137
Apr 22 19:50:56.572626 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.572548 2576 generic.go:358] "Generic (PLEG): container finished" podID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerID="34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8" exitCode=0
Apr 22 19:50:56.572626 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.572601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" event={"ID":"91a258a7-24b8-4431-9958-dc5164f90c1f","Type":"ContainerDied","Data":"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c"}
Apr 22 19:50:56.572711 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.572627 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs"
Apr 22 19:50:56.572711 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.572638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" event={"ID":"91a258a7-24b8-4431-9958-dc5164f90c1f","Type":"ContainerDied","Data":"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8"}
Apr 22 19:50:56.572711 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.572650 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs" event={"ID":"91a258a7-24b8-4431-9958-dc5164f90c1f","Type":"ContainerDied","Data":"82f827dba297ac42477dbfd52b65c18c0bd0fc7562ae6ef6b2d43a7c2ae1b906"}
Apr 22 19:50:56.572711 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.572665 2576 scope.go:117] "RemoveContainer" containerID="7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c"
Apr 22 19:50:56.593099 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.593079 2576 scope.go:117] "RemoveContainer" containerID="4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c"
Apr 22 19:50:56.602020 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.601993 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs"]
Apr 22 19:50:56.604001 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.603968 2576 scope.go:117] "RemoveContainer" containerID="34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8"
Apr 22 19:50:56.606439 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.606422 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-66fb4d6fd8-4n7gs"]
Apr 22 19:50:56.612641 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.612620 2576 scope.go:117] "RemoveContainer"
containerID="7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c" Apr 22 19:50:56.612905 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:50:56.612882 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c\": container with ID starting with 7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c not found: ID does not exist" containerID="7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c" Apr 22 19:50:56.612968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.612918 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c"} err="failed to get container status \"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c\": rpc error: code = NotFound desc = could not find container \"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c\": container with ID starting with 7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c not found: ID does not exist" Apr 22 19:50:56.612968 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.612946 2576 scope.go:117] "RemoveContainer" containerID="4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c" Apr 22 19:50:56.613252 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:50:56.613233 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c\": container with ID starting with 4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c not found: ID does not exist" containerID="4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c" Apr 22 19:50:56.613304 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.613257 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c"} err="failed to get container status \"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c\": rpc error: code = NotFound desc = could not find container \"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c\": container with ID starting with 4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c not found: ID does not exist" Apr 22 19:50:56.613304 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.613273 2576 scope.go:117] "RemoveContainer" containerID="34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8" Apr 22 19:50:56.613562 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:50:56.613532 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8\": container with ID starting with 34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8 not found: ID does not exist" containerID="34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8" Apr 22 19:50:56.613671 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.613571 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8"} err="failed to get container status \"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8\": rpc error: code = NotFound desc = could not find container \"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8\": container with ID starting with 34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8 not found: ID does not exist" Apr 22 19:50:56.613671 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.613593 2576 scope.go:117] "RemoveContainer" containerID="7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c" Apr 22 19:50:56.613861 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.613840 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c"} err="failed to get container status \"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c\": rpc error: code = NotFound desc = could not find container \"7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c\": container with ID starting with 7d691befdef8ed2e003e8bc70ea403268d6f2c4441ea8de41581b0138945bb1c not found: ID does not exist" Apr 22 19:50:56.613918 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.613862 2576 scope.go:117] "RemoveContainer" containerID="4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c" Apr 22 19:50:56.614082 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.614060 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c"} err="failed to get container status \"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c\": rpc error: code = NotFound desc = could not find container \"4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c\": container with ID starting with 4e009b6e01f4a99ab46c0aedb3b1af1c1d0c3343dddfcf33e3c5d6367ce95b2c not found: ID does not exist" Apr 22 19:50:56.614082 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.614081 2576 scope.go:117] "RemoveContainer" containerID="34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8" Apr 22 19:50:56.614316 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.614295 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8"} err="failed to get container status \"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8\": rpc error: code = NotFound desc = could not 
find container \"34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8\": container with ID starting with 34514f2450251eadc6673d5c3d8f4819db01923339cc6bd79a9dcea6f5f925f8 not found: ID does not exist" Apr 22 19:50:56.761112 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:50:56.761079 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" path="/var/lib/kubelet/pods/91a258a7-24b8-4431-9958-dc5164f90c1f/volumes" Apr 22 19:51:00.840604 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:51:00.840559 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 19:51:10.840785 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:51:10.840742 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 19:51:12.557793 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:51:12.557763 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" Apr 22 19:51:20.841013 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:51:20.840961 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 19:51:30.840725 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:51:30.840677 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 19:51:40.841437 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:51:40.841385 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 19:51:50.840922 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:51:50.840881 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 19:52:00.841082 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:00.841037 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 10.134.0.52:8000: connect: connection refused" Apr 22 19:52:10.840981 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:10.840940 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8000/health\": dial tcp 
10.134.0.52:8000: connect: connection refused" Apr 22 19:52:20.851075 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:20.851041 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" Apr 22 19:52:20.858650 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:20.858631 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" Apr 22 19:52:42.369881 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:42.369794 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"] Apr 22 19:52:42.370891 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:42.370744 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="main" containerID="cri-o://d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2" gracePeriod=30 Apr 22 19:52:42.370891 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:42.370802 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="tokenizer" containerID="cri-o://7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b" gracePeriod=30 Apr 22 19:52:42.372844 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:42.372817 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"] Apr 22 19:52:42.373324 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:42.373271 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" containerID="cri-o://8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235" gracePeriod=30 Apr 22 19:52:42.556788 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:52:42.556754 2576 logging.go:55] [core] [Channel #349 SubChannel #350]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.53:9003", ServerName: "10.134.0.53:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.53:9003: connect: connection refused" Apr 22 19:52:42.963877 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:42.963840 2576 generic.go:358] "Generic (PLEG): container finished" podID="46f9b826-c792-41b8-a28c-9fc703563759" containerID="d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2" exitCode=0 Apr 22 19:52:42.964054 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:42.963887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" event={"ID":"46f9b826-c792-41b8-a28c-9fc703563759","Type":"ContainerDied","Data":"d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2"} Apr 22 19:52:43.525373 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.525352 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" Apr 22 19:52:43.557190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.557151 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.53:9003\" within 1s: context deadline exceeded" Apr 22 19:52:43.577432 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577405 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-cache\") pod \"46f9b826-c792-41b8-a28c-9fc703563759\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " Apr 22 19:52:43.577570 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577436 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6vc2\" (UniqueName: \"kubernetes.io/projected/46f9b826-c792-41b8-a28c-9fc703563759-kube-api-access-g6vc2\") pod \"46f9b826-c792-41b8-a28c-9fc703563759\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " Apr 22 19:52:43.577570 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577460 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-kserve-provision-location\") pod \"46f9b826-c792-41b8-a28c-9fc703563759\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " Apr 22 19:52:43.577570 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577536 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46f9b826-c792-41b8-a28c-9fc703563759-tls-certs\") pod \"46f9b826-c792-41b8-a28c-9fc703563759\" 
(UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " Apr 22 19:52:43.577742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577590 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-tmp\") pod \"46f9b826-c792-41b8-a28c-9fc703563759\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " Apr 22 19:52:43.577742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577718 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-uds\") pod \"46f9b826-c792-41b8-a28c-9fc703563759\" (UID: \"46f9b826-c792-41b8-a28c-9fc703563759\") " Apr 22 19:52:43.577742 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577726 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "46f9b826-c792-41b8-a28c-9fc703563759" (UID: "46f9b826-c792-41b8-a28c-9fc703563759"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:52:43.577927 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577907 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "46f9b826-c792-41b8-a28c-9fc703563759" (UID: "46f9b826-c792-41b8-a28c-9fc703563759"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:52:43.577979 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.577961 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "46f9b826-c792-41b8-a28c-9fc703563759" (UID: "46f9b826-c792-41b8-a28c-9fc703563759"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:52:43.578079 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.578059 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-uds\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:52:43.578079 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.578077 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:52:43.578251 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.578088 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-tokenizer-tmp\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:52:43.578338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.578315 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "46f9b826-c792-41b8-a28c-9fc703563759" (UID: "46f9b826-c792-41b8-a28c-9fc703563759"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:52:43.579672 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.579647 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f9b826-c792-41b8-a28c-9fc703563759-kube-api-access-g6vc2" (OuterVolumeSpecName: "kube-api-access-g6vc2") pod "46f9b826-c792-41b8-a28c-9fc703563759" (UID: "46f9b826-c792-41b8-a28c-9fc703563759"). InnerVolumeSpecName "kube-api-access-g6vc2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:52:43.579760 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.579670 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46f9b826-c792-41b8-a28c-9fc703563759-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "46f9b826-c792-41b8-a28c-9fc703563759" (UID: "46f9b826-c792-41b8-a28c-9fc703563759"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:52:43.678983 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.678906 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g6vc2\" (UniqueName: \"kubernetes.io/projected/46f9b826-c792-41b8-a28c-9fc703563759-kube-api-access-g6vc2\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:52:43.678983 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.678933 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/46f9b826-c792-41b8-a28c-9fc703563759-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:52:43.678983 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.678943 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/46f9b826-c792-41b8-a28c-9fc703563759-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:52:43.969028 ip-10-0-132-160 kubenswrapper[2576]: 
I0422 19:52:43.968938 2576 generic.go:358] "Generic (PLEG): container finished" podID="46f9b826-c792-41b8-a28c-9fc703563759" containerID="7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b" exitCode=0 Apr 22 19:52:43.969028 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.969007 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" Apr 22 19:52:43.969217 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.969027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" event={"ID":"46f9b826-c792-41b8-a28c-9fc703563759","Type":"ContainerDied","Data":"7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b"} Apr 22 19:52:43.969217 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.969066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4" event={"ID":"46f9b826-c792-41b8-a28c-9fc703563759","Type":"ContainerDied","Data":"c9778fd9a4935afdfe0ca4d71a092607a60b194991ebd36f2334634054f5b762"} Apr 22 19:52:43.969217 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.969085 2576 scope.go:117] "RemoveContainer" containerID="7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b" Apr 22 19:52:43.978667 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.978648 2576 scope.go:117] "RemoveContainer" containerID="d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2" Apr 22 19:52:43.986374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.986357 2576 scope.go:117] "RemoveContainer" containerID="8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7" Apr 22 19:52:43.991901 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.991880 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"] Apr 22 19:52:43.994087 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.994070 2576 scope.go:117] "RemoveContainer" containerID="7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b" Apr 22 19:52:43.994300 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:52:43.994284 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b\": container with ID starting with 7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b not found: ID does not exist" containerID="7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b" Apr 22 19:52:43.994345 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.994308 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b"} err="failed to get container status \"7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b\": rpc error: code = NotFound desc = could not find container \"7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b\": container with ID starting with 7b0b4c34c77d4569c353fa8a15796da1e204fd8042993fb8d0683c3b60089a2b not found: ID does not exist" Apr 22 19:52:43.994345 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.994324 2576 scope.go:117] "RemoveContainer" containerID="d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2" Apr 22 19:52:43.994576 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:52:43.994557 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2\": container with ID starting with d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2 not found: ID does not exist" 
containerID="d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2" Apr 22 19:52:43.994645 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.994584 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2"} err="failed to get container status \"d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2\": rpc error: code = NotFound desc = could not find container \"d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2\": container with ID starting with d1c016ed8d04cb7f607c6758ccf215ff37390d0cd324533c0b537112a0270ba2 not found: ID does not exist" Apr 22 19:52:43.994645 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.994607 2576 scope.go:117] "RemoveContainer" containerID="8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7" Apr 22 19:52:43.994814 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:52:43.994800 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7\": container with ID starting with 8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7 not found: ID does not exist" containerID="8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7" Apr 22 19:52:43.994854 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.994816 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7"} err="failed to get container status \"8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7\": rpc error: code = NotFound desc = could not find container \"8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7\": container with ID starting with 8ee40826603441cfc8523b7fcc5f0881de44b6ab37d8da272d2a29252fa265d7 not found: ID does not exist" Apr 22 
19:52:43.997402 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:43.997377 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-8944947d7rl2x4"] Apr 22 19:52:44.755246 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:44.755208 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f9b826-c792-41b8-a28c-9fc703563759" path="/var/lib/kubelet/pods/46f9b826-c792-41b8-a28c-9fc703563759/volumes" Apr 22 19:52:57.180658 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:57.180626 2576 ???:1] "http: TLS handshake error from 10.0.132.160:43776: EOF" Apr 22 19:52:57.184254 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:57.184228 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:52:57.203544 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:57.203520 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:52:58.192713 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:58.192684 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:52:58.199657 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:58.199636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:52:59.171979 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:59.171953 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:52:59.178812 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:52:59.178789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:00.172795 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:00.172744 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:00.182773 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:00.182749 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:01.172725 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:01.172694 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:01.179841 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:01.179816 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:02.179230 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:02.179197 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:02.185072 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:02.185040 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:03.164758 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:03.164722 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:03.171221 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:03.171195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:04.164740 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:04.164715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:04.170982 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:04.170959 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:05.226458 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:05.226423 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:05.233477 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:05.233456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:06.255384 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:06.255355 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:06.262472 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:06.262452 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:07.227483 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:07.227450 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:07.233859 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:07.233833 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:08.224176 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:08.224147 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:08.230353 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:08.230335 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:09.186373 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:09.186343 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:09.192280 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:09.192255 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:10.178348 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:10.178319 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/main/0.log" Apr 22 19:53:10.184889 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:10.184869 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg_90b0b081-0e8b-4de5-8482-87297243aeae/storage-initializer/0.log" Apr 22 19:53:11.167841 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:11.167813 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79d8474b76-rbnlg_9a37ca65-2a95-4238-9686-942fb21e4095/router/0.log" Apr 22 19:53:11.974403 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:11.974379 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79d8474b76-rbnlg_9a37ca65-2a95-4238-9686-942fb21e4095/router/0.log" Apr 22 19:53:12.620847 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.620827 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" Apr 22 19:53:12.717763 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.717690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-7sxnb_30840fff-33dc-4c20-8ad1-42a64abe6b20/authorino/0.log" Apr 22 19:53:12.752987 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.752964 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-kserve-provision-location\") pod \"90b0b081-0e8b-4de5-8482-87297243aeae\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " Apr 22 19:53:12.753115 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.753035 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-model-cache\") pod \"90b0b081-0e8b-4de5-8482-87297243aeae\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " Apr 22 19:53:12.753115 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.753059 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-dshm\") pod \"90b0b081-0e8b-4de5-8482-87297243aeae\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " Apr 22 19:53:12.753115 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.753104 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bfl7\" (UniqueName: \"kubernetes.io/projected/90b0b081-0e8b-4de5-8482-87297243aeae-kube-api-access-5bfl7\") pod \"90b0b081-0e8b-4de5-8482-87297243aeae\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " Apr 22 19:53:12.753242 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.753146 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90b0b081-0e8b-4de5-8482-87297243aeae-tls-certs\") pod \"90b0b081-0e8b-4de5-8482-87297243aeae\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " Apr 22 19:53:12.753242 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.753166 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-home\") pod \"90b0b081-0e8b-4de5-8482-87297243aeae\" (UID: \"90b0b081-0e8b-4de5-8482-87297243aeae\") " Apr 22 19:53:12.753363 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.753338 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-model-cache" (OuterVolumeSpecName: "model-cache") pod "90b0b081-0e8b-4de5-8482-87297243aeae" (UID: "90b0b081-0e8b-4de5-8482-87297243aeae"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:12.753599 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.753453 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-model-cache\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.753727 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.753702 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-home" (OuterVolumeSpecName: "home") pod "90b0b081-0e8b-4de5-8482-87297243aeae" (UID: "90b0b081-0e8b-4de5-8482-87297243aeae"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:12.754109 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.754087 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-tbgzt_a2e34b1d-33ec-4538-a67b-c74df6e93564/kuadrant-console-plugin/0.log" Apr 22 19:53:12.755418 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.755389 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-dshm" (OuterVolumeSpecName: "dshm") pod "90b0b081-0e8b-4de5-8482-87297243aeae" (UID: "90b0b081-0e8b-4de5-8482-87297243aeae"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:12.755573 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.755483 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b0b081-0e8b-4de5-8482-87297243aeae-kube-api-access-5bfl7" (OuterVolumeSpecName: "kube-api-access-5bfl7") pod "90b0b081-0e8b-4de5-8482-87297243aeae" (UID: "90b0b081-0e8b-4de5-8482-87297243aeae"). InnerVolumeSpecName "kube-api-access-5bfl7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:53:12.755724 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.755705 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b0b081-0e8b-4de5-8482-87297243aeae-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "90b0b081-0e8b-4de5-8482-87297243aeae" (UID: "90b0b081-0e8b-4de5-8482-87297243aeae"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:53:12.817640 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.817492 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "90b0b081-0e8b-4de5-8482-87297243aeae" (UID: "90b0b081-0e8b-4de5-8482-87297243aeae"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:53:12.854399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.854373 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/90b0b081-0e8b-4de5-8482-87297243aeae-tls-certs\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.854399 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.854401 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-home\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.854643 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.854416 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-kserve-provision-location\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.854643 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.854432 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/90b0b081-0e8b-4de5-8482-87297243aeae-dshm\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:53:12.854643 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:12.854446 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bfl7\" (UniqueName: 
\"kubernetes.io/projected/90b0b081-0e8b-4de5-8482-87297243aeae-kube-api-access-5bfl7\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 22 19:53:13.071610 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.071569 2576 generic.go:358] "Generic (PLEG): container finished" podID="90b0b081-0e8b-4de5-8482-87297243aeae" containerID="8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235" exitCode=137 Apr 22 19:53:13.072044 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.071622 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" event={"ID":"90b0b081-0e8b-4de5-8482-87297243aeae","Type":"ContainerDied","Data":"8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235"} Apr 22 19:53:13.072044 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.071656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" event={"ID":"90b0b081-0e8b-4de5-8482-87297243aeae","Type":"ContainerDied","Data":"e59c649c067cbb0de1c20710eb4604271a353eb11bd1db2a17d8d43fff962039"} Apr 22 19:53:13.072044 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.071674 2576 scope.go:117] "RemoveContainer" containerID="8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235" Apr 22 19:53:13.072044 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.071678 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg" Apr 22 19:53:13.092954 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.092935 2576 scope.go:117] "RemoveContainer" containerID="f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73" Apr 22 19:53:13.094658 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.094635 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"] Apr 22 19:53:13.098446 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.098418 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-68d9f8b6bf-4mjkg"] Apr 22 19:53:13.103258 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.103243 2576 scope.go:117] "RemoveContainer" containerID="8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235" Apr 22 19:53:13.103541 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:53:13.103491 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235\": container with ID starting with 8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235 not found: ID does not exist" containerID="8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235" Apr 22 19:53:13.103635 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.103553 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235"} err="failed to get container status \"8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235\": rpc error: code = NotFound desc = could not find container \"8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235\": container with ID starting with 
8f06c8e5f742d435318d5c0a31c346c7c9e055a5b01918981f7b9a1f0793d235 not found: ID does not exist" Apr 22 19:53:13.103635 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.103578 2576 scope.go:117] "RemoveContainer" containerID="f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73" Apr 22 19:53:13.103858 ip-10-0-132-160 kubenswrapper[2576]: E0422 19:53:13.103839 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73\": container with ID starting with f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73 not found: ID does not exist" containerID="f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73" Apr 22 19:53:13.103909 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:13.103864 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73"} err="failed to get container status \"f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73\": rpc error: code = NotFound desc = could not find container \"f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73\": container with ID starting with f70540e8c8743edaa69333f63a6a43b6650f46474640360e239ff8f65a2b0b73 not found: ID does not exist" Apr 22 19:53:14.755659 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:14.755626 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" path="/var/lib/kubelet/pods/90b0b081-0e8b-4de5-8482-87297243aeae/volumes" Apr 22 19:53:18.005228 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:18.005195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8c65l_f9d8901e-a9f8-4dab-bc05-d442f52d1851/global-pull-secret-syncer/0.log" Apr 22 19:53:18.082520 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:18.082469 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fkhrk_388a5b94-24f5-48ff-aa37-825c8e5a0b2a/konnectivity-agent/0.log" Apr 22 19:53:18.152305 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:18.152274 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-160.ec2.internal_206ff994154571336dcc99880b36f4f2/haproxy/0.log" Apr 22 19:53:22.059640 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:22.059610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-7sxnb_30840fff-33dc-4c20-8ad1-42a64abe6b20/authorino/0.log" Apr 22 19:53:22.141857 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:22.141829 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-tbgzt_a2e34b1d-33ec-4538-a67b-c74df6e93564/kuadrant-console-plugin/0.log" Apr 22 19:53:23.206163 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.206133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5743be61-1cf4-4c4b-94fc-857af4fd539e/alertmanager/0.log" Apr 22 19:53:23.223948 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.223921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5743be61-1cf4-4c4b-94fc-857af4fd539e/config-reloader/0.log" Apr 22 19:53:23.240634 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.240606 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5743be61-1cf4-4c4b-94fc-857af4fd539e/kube-rbac-proxy-web/0.log" Apr 22 19:53:23.258757 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.258728 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5743be61-1cf4-4c4b-94fc-857af4fd539e/kube-rbac-proxy/0.log" Apr 22 19:53:23.278696 ip-10-0-132-160 kubenswrapper[2576]: I0422 
19:53:23.278669 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5743be61-1cf4-4c4b-94fc-857af4fd539e/kube-rbac-proxy-metric/0.log" Apr 22 19:53:23.301449 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.301424 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5743be61-1cf4-4c4b-94fc-857af4fd539e/prom-label-proxy/0.log" Apr 22 19:53:23.324350 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.324327 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5743be61-1cf4-4c4b-94fc-857af4fd539e/init-config-reloader/0.log" Apr 22 19:53:23.366296 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.366251 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-glpd4_d6b05d52-0d1e-4259-a36b-3b1d0e753715/cluster-monitoring-operator/0.log" Apr 22 19:53:23.463673 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.463606 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-5f4fd5879d-4728c_5f026615-7d00-4280-9da6-68e7d4f3e23d/metrics-server/0.log" Apr 22 19:53:23.521449 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.521428 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4d486_b9bf8578-aad6-4b99-a551-6f6e222655c1/node-exporter/0.log" Apr 22 19:53:23.546525 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.546484 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4d486_b9bf8578-aad6-4b99-a551-6f6e222655c1/kube-rbac-proxy/0.log" Apr 22 19:53:23.565049 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:23.565027 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-4d486_b9bf8578-aad6-4b99-a551-6f6e222655c1/init-textfile/0.log" Apr 22 19:53:24.016953 
ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:24.016923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5966944dd6-gn5mz_1ee74941-d14c-462e-a4a9-e8ed9058b8dc/telemeter-client/0.log" Apr 22 19:53:24.034523 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:24.034480 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5966944dd6-gn5mz_1ee74941-d14c-462e-a4a9-e8ed9058b8dc/reload/0.log" Apr 22 19:53:24.050607 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:24.050588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5966944dd6-gn5mz_1ee74941-d14c-462e-a4a9-e8ed9058b8dc/kube-rbac-proxy/0.log" Apr 22 19:53:26.044410 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.044380 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/1.log" Apr 22 19:53:26.049109 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.049088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-tzsmr_2a2eeae7-3634-4f74-a8ab-198bbd3ed2a4/console-operator/2.log" Apr 22 19:53:26.483544 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.483447 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-954dd468f-bvk7r_2f956b7a-e61e-497e-bf6a-382a5c64e13e/console/0.log" Apr 22 19:53:26.507389 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.507357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-dtw9f_a58a96e8-098b-416a-9c25-554a1abb6b1d/download-server/0.log" Apr 22 19:53:26.850921 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.850889 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"] Apr 22 
19:53:26.851286 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851271 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="tokenizer" Apr 22 19:53:26.851286 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851286 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="tokenizer" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851295 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851301 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851313 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="storage-initializer" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851320 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="storage-initializer" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851331 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="storage-initializer" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851336 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="storage-initializer" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851347 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" 
Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851353 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851360 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="llm-d-routing-sidecar" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851365 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="llm-d-routing-sidecar" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851374 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="storage-initializer" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851379 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="storage-initializer" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851388 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="main" Apr 22 19:53:26.851412 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851393 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="main" Apr 22 19:53:26.851851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851438 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="90b0b081-0e8b-4de5-8482-87297243aeae" containerName="main" Apr 22 19:53:26.851851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851447 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="llm-d-routing-sidecar" Apr 22 
19:53:26.851851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851455 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="91a258a7-24b8-4431-9958-dc5164f90c1f" containerName="main"
Apr 22 19:53:26.851851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851462 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="main"
Apr 22 19:53:26.851851 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.851470 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="46f9b826-c792-41b8-a28c-9fc703563759" containerName="tokenizer"
Apr 22 19:53:26.856639 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.856620 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.859019 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.858998 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x642z\"/\"kube-root-ca.crt\""
Apr 22 19:53:26.859133 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.859114 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x642z\"/\"default-dockercfg-6r5dj\""
Apr 22 19:53:26.860116 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.860101 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x642z\"/\"openshift-service-ca.crt\""
Apr 22 19:53:26.863076 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.863055 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"]
Apr 22 19:53:26.880572 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.880548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-lib-modules\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.880689 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.880586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-sys\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.880689 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.880644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m8tx\" (UniqueName: \"kubernetes.io/projected/122138c0-d509-4c45-b621-82cfcb119c43-kube-api-access-2m8tx\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.880774 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.880690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-podres\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.880774 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.880734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-proc\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982007 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.981974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-podres\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982167 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.982038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-proc\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982167 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.982072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-lib-modules\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982167 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.982106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-sys\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982167 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.982146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m8tx\" (UniqueName: \"kubernetes.io/projected/122138c0-d509-4c45-b621-82cfcb119c43-kube-api-access-2m8tx\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982167 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.982150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-podres\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.982157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-proc\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.982185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-sys\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.982374 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.982203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/122138c0-d509-4c45-b621-82cfcb119c43-lib-modules\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:26.989760 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:26.989740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m8tx\" (UniqueName: \"kubernetes.io/projected/122138c0-d509-4c45-b621-82cfcb119c43-kube-api-access-2m8tx\") pod \"perf-node-gather-daemonset-cwsrq\" (UID: \"122138c0-d509-4c45-b621-82cfcb119c43\") " pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:27.028133 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:27.028109 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dj2zn_6ea4f413-e0ef-4b63-9f20-54a4234931a6/volume-data-source-validator/0.log"
Apr 22 19:53:27.166877 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:27.166790 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:27.494869 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:27.494789 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"]
Apr 22 19:53:27.497775 ip-10-0-132-160 kubenswrapper[2576]: W0422 19:53:27.497737 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod122138c0_d509_4c45_b621_82cfcb119c43.slice/crio-990b7cac2a6bda6d277fb6c4c86c186e5932bada9aa607ff2bc396b69eca2682 WatchSource:0}: Error finding container 990b7cac2a6bda6d277fb6c4c86c186e5932bada9aa607ff2bc396b69eca2682: Status 404 returned error can't find the container with id 990b7cac2a6bda6d277fb6c4c86c186e5932bada9aa607ff2bc396b69eca2682
Apr 22 19:53:27.499317 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:27.499296 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:53:27.727516 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:27.727469 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hhwkd_f8046910-c087-4b0d-a917-3216261f41d0/dns/0.log"
Apr 22 19:53:27.742309 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:27.742287 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hhwkd_f8046910-c087-4b0d-a917-3216261f41d0/kube-rbac-proxy/0.log"
Apr 22 19:53:27.910710 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:27.910683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-frj6d_9f0eeacf-656e-4f8f-aa52-91ca36e9a6b6/dns-node-resolver/0.log"
Apr 22 19:53:28.128190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:28.128157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq" event={"ID":"122138c0-d509-4c45-b621-82cfcb119c43","Type":"ContainerStarted","Data":"3db44d34d77456714724ef8e82c54cdb30c56958916457197a638c6d81ac7ce9"}
Apr 22 19:53:28.128190 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:28.128193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq" event={"ID":"122138c0-d509-4c45-b621-82cfcb119c43","Type":"ContainerStarted","Data":"990b7cac2a6bda6d277fb6c4c86c186e5932bada9aa607ff2bc396b69eca2682"}
Apr 22 19:53:28.128409 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:28.128223 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:28.146203 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:28.146157 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq" podStartSLOduration=2.146143835 podStartE2EDuration="2.146143835s" podCreationTimestamp="2026-04-22 19:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:53:28.144961014 +0000 UTC m=+1757.929950887" watchObservedRunningTime="2026-04-22 19:53:28.146143835 +0000 UTC m=+1757.931133741"
Apr 22 19:53:28.366781 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:28.366746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-554b658566-jfndr_2c00bd78-14f8-4c6f-b6e5-e536a5b57c3e/registry/0.log"
Apr 22 19:53:28.421942 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:28.421914 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-klmr4_c5c6ea94-6f38-4282-bf4f-2a7ad7fa513a/node-ca/0.log"
Apr 22 19:53:29.265338 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:29.265309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79d8474b76-rbnlg_9a37ca65-2a95-4238-9686-942fb21e4095/router/0.log"
Apr 22 19:53:29.686491 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:29.686409 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9tp5r_15547e0d-8f10-470f-a80b-0cb53add2696/serve-healthcheck-canary/0.log"
Apr 22 19:53:30.116209 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:30.116180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-b7xnl_1a23807b-8ae8-4f57-aaa4-f01cc3d1f680/insights-operator/0.log"
Apr 22 19:53:30.116380 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:30.116270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-b7xnl_1a23807b-8ae8-4f57-aaa4-f01cc3d1f680/insights-operator/1.log"
Apr 22 19:53:30.131972 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:30.131947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gg6h_52b31b6c-6072-4c90-8348-9510ed167d08/kube-rbac-proxy/0.log"
Apr 22 19:53:30.148137 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:30.148110 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gg6h_52b31b6c-6072-4c90-8348-9510ed167d08/exporter/0.log"
Apr 22 19:53:30.163314 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:30.163292 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gg6h_52b31b6c-6072-4c90-8348-9510ed167d08/extractor/0.log"
Apr 22 19:53:32.767299 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:32.767273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-98c76994c-4dslq_86a155d5-abb8-4ef0-9aa5-a2e58e3312df/manager/0.log"
Apr 22 19:53:32.813476 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:32.813444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-wcs7r_b50d0c68-2415-4ac0-ac90-80fd5064f054/openshift-lws-operator/0.log"
Apr 22 19:53:34.143456 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:34.143428 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x642z/perf-node-gather-daemonset-cwsrq"
Apr 22 19:53:38.729228 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:38.729195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-788qs_89e967fc-c463-4f10-9b81-910499e78afc/kube-storage-version-migrator-operator/1.log"
Apr 22 19:53:38.731119 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:38.731100 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-788qs_89e967fc-c463-4f10-9b81-910499e78afc/kube-storage-version-migrator-operator/0.log"
Apr 22 19:53:39.610604 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:39.610569 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4q6q8_b2d3136a-c73d-4ecd-a4e9-a9c3c3605915/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:53:39.626816 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:39.626792 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4q6q8_b2d3136a-c73d-4ecd-a4e9-a9c3c3605915/egress-router-binary-copy/0.log"
Apr 22 19:53:39.643166 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:39.643145 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4q6q8_b2d3136a-c73d-4ecd-a4e9-a9c3c3605915/cni-plugins/0.log"
Apr 22 19:53:39.658791 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:39.658772 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4q6q8_b2d3136a-c73d-4ecd-a4e9-a9c3c3605915/bond-cni-plugin/0.log"
Apr 22 19:53:39.676889 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:39.676869 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4q6q8_b2d3136a-c73d-4ecd-a4e9-a9c3c3605915/routeoverride-cni/0.log"
Apr 22 19:53:39.717292 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:39.717271 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4q6q8_b2d3136a-c73d-4ecd-a4e9-a9c3c3605915/whereabouts-cni-bincopy/0.log"
Apr 22 19:53:39.731103 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:39.731081 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4q6q8_b2d3136a-c73d-4ecd-a4e9-a9c3c3605915/whereabouts-cni/0.log"
Apr 22 19:53:40.094011 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:40.093982 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-prwb9_357c3ccf-e489-43f2-ae48-f23177ef4481/kube-multus/0.log"
Apr 22 19:53:40.189123 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:40.189098 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rqq85_342209dc-2f51-4fc2-a96f-a19424f86d57/network-metrics-daemon/0.log"
Apr 22 19:53:40.204364 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:40.204339 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rqq85_342209dc-2f51-4fc2-a96f-a19424f86d57/kube-rbac-proxy/0.log"
Apr 22 19:53:41.395284 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.395252 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-controller/0.log"
Apr 22 19:53:41.407001 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.406978 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/0.log"
Apr 22 19:53:41.414003 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.413982 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovn-acl-logging/1.log"
Apr 22 19:53:41.429035 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.429010 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/kube-rbac-proxy-node/0.log"
Apr 22 19:53:41.444654 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.444636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:53:41.460486 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.460464 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/northd/0.log"
Apr 22 19:53:41.474386 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.474370 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/nbdb/0.log"
Apr 22 19:53:41.490860 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.490835 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/sbdb/0.log"
Apr 22 19:53:41.585107 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:41.585080 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9f6tl_3a50980a-3501-4203-96f8-93510d032673/ovnkube-controller/0.log"
Apr 22 19:53:42.895620 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:42.895542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sxrzv_52bb575c-7bf3-4562-b917-b5d06e683525/network-check-target-container/0.log"
Apr 22 19:53:43.829348 ip-10-0-132-160 kubenswrapper[2576]: I0422 19:53:43.829321 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-l2kj9_32f2d314-d41f-4b5f-990f-0750a69d5f47/iptables-alerter/0.log"