Apr 17 16:30:52.005399 ip-10-0-130-35 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:30:52.352333 ip-10-0-130-35 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:30:52.352333 ip-10-0-130-35 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:30:52.352333 ip-10-0-130-35 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:30:52.352333 ip-10-0-130-35 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:30:52.352333 ip-10-0-130-35 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:30:52.353711 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.353621    2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:30:52.360101 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360064    2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:30:52.360101 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360098    2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:30:52.360101 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360102    2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:30:52.360101 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360105    2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:30:52.360101 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360108    2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360112    2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360115    2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360118    2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360121    2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360124    2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360127    2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360130    2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360132    2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360135    2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360138    2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360140    2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360143    2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360146    2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360148    2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360151    2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360156    2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360159    2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360171    2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:30:52.360292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360174    2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360177    2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360180    2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360183    2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360186    2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360189    2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360192    2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360195    2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360198    2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360200    2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360203    2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360205    2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360209    2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360211    2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360214    2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360217    2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360220    2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360223    2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360225    2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360228    2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:30:52.360776 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360231    2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360234    2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360238    2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360241    2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360243    2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360246    2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360249    2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360251    2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360254    2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360256    2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360259    2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360261    2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360264    2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360266    2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360269    2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360271    2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360274    2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360276    2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360279    2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360282    2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:30:52.361282 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360285    2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360287    2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360290    2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360293    2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360296    2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360299    2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360303    2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360306    2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360308    2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360311    2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360314    2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360316    2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360319    2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360322    2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360325    2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360328    2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360332    2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360335    2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360338    2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360341    2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:30:52.361763 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360344    2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360346    2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360349    2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360721    2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360726    2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360729    2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360732    2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360734    2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360737    2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360740    2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360743    2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360745    2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360748    2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360751    2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360754    2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360756    2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360759    2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360763    2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360767    2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:30:52.362292 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360770    2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360773    2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360777    2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360779    2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360782    2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360785    2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360788    2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360791    2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360794    2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360797    2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360799    2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360802    2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360804    2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360807    2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360809    2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360812    2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360815    2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360817    2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360819    2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360822    2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:30:52.362745 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360824    2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360827    2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360829    2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360832    2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360834    2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360837    2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360839    2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360842    2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360845    2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360847    2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360850    2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360852    2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360855    2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360858    2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360860    2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360862    2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360865    2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360868    2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360870    2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360873    2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:30:52.363366 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360875    2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360884    2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360887    2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360890    2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360892    2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360895    2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360898    2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360900    2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360903    2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360905    2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360908    2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360910    2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360913    2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360916    2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360918    2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360921    2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360924    2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360927    2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360929    2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360932    2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:30:52.363869 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360935    2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360938    2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360941    2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360943    2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360948    2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360952    2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360955    2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360958    2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360961    2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.360964    2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362171    2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362182    2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362190    2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362194    2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362200    2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362203    2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362208    2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362212    2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362216    2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362219    2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362223    2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:30:52.364376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362226    2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362230    2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362233    2561 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362236    2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362239    2561 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362241    2561 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362244    2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362247    2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362251    2561 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362254    2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362257    2561 flags.go:64] FLAG: --config-dir=""
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362260    2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362263    2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362268    2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362271    2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362274    2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362278    2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362281    2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362284    2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362287    2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362290    2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362293    2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362298    2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362301    2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362304    2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:30:52.364890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362307    2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362311    2561 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362314    2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362318    2561 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362322    2561 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362324    2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362327    2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362330    2561 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362334    2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362337    2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362340    2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362343    2561 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362346    2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362349    2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362353    2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362356    2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362359    2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362362    2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362365    2561 flags.go:64] FLAG: --feature-gates=""
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362369    2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362372    2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362375    2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362379    2561 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362382    2561 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362385    2561 flags.go:64] FLAG: --help="false"
Apr 17 16:30:52.365511 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362388    2561 flags.go:64] FLAG: --hostname-override="ip-10-0-130-35.ec2.internal"
Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362391    2561 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362394    2561
flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362397 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362401 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362404 2561 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362407 2561 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362410 2561 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362413 2561 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362416 2561 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362419 2561 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362423 2561 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362426 2561 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362428 2561 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362431 2561 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362434 2561 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:30:52.366138 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:30:52.362437 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362440 2561 flags.go:64] FLAG: --lock-file="" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362443 2561 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362446 2561 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362449 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362455 2561 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362458 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362462 2561 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:30:52.366138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362464 2561 flags.go:64] FLAG: --logging-format="text" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362467 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362471 2561 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362474 2561 flags.go:64] FLAG: --manifest-url="" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362477 2561 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362481 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362484 2561 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362489 2561 flags.go:64] FLAG: --max-pods="110" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362491 2561 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362494 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362497 2561 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362500 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362503 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362506 2561 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362509 2561 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362516 2561 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362519 2561 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362522 2561 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362526 2561 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362529 2561 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: 
I0417 16:30:52.362535 2561 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362538 2561 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362541 2561 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362544 2561 flags.go:64] FLAG: --port="10250" Apr 17 16:30:52.366729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362547 2561 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362550 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-038c9fc9857ed204f" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362553 2561 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362556 2561 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362559 2561 flags.go:64] FLAG: --register-node="true" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362562 2561 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362565 2561 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362568 2561 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362571 2561 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362574 2561 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362577 2561 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362580 2561 flags.go:64] FLAG: 
--resolv-conf="/etc/resolv.conf" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362583 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362586 2561 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362589 2561 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362592 2561 flags.go:64] FLAG: --runonce="false" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362594 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362597 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362600 2561 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362603 2561 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362606 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362609 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362612 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362615 2561 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362617 2561 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:30:52.367309 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362620 2561 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:30:52.367309 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:30:52.362623 2561 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362635 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362638 2561 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362641 2561 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362644 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362650 2561 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362652 2561 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362655 2561 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362659 2561 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362662 2561 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362664 2561 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362667 2561 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362671 2561 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362674 2561 flags.go:64] FLAG: --v="2" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362678 2561 flags.go:64] FLAG: --version="false" Apr 17 16:30:52.367935 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362682 2561 flags.go:64] FLAG: --vmodule="" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362687 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.362690 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362781 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362785 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362788 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362791 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362794 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362797 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:30:52.367935 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362799 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362803 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362805 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362808 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 
16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362811 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362814 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362816 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362819 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362822 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362826 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362829 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362832 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362835 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362837 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362840 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362842 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362845 2561 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362847 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362850 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362854 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:30:52.368514 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362857 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362859 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362862 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362864 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362867 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362869 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362872 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362875 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362877 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362879 2561 feature_gate.go:328] unrecognized feature 
gate: ImageModeStatusReporting Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362882 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362884 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362887 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362889 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362892 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362894 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362897 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362899 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362902 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362906 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:30:52.369055 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362909 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362913 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362916 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362919 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362922 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362924 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362926 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362929 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362932 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362934 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362936 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362940 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362943 2561 feature_gate.go:328] 
unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362946 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362948 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362951 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362954 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362956 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362959 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362962 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:30:52.369548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362964 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362967 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362969 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362972 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362976 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362979 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362982 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362985 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362987 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362990 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362992 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362995 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.362997 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.363001 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.363004 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.363006 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.363009 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 
16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.363012 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.363014 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:30:52.370037 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.363017 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.363550 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.369895 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.369915 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369965 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369971 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369974 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369977 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369981 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369984 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369987 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369990 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369993 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369995 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.369998 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370001 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:30:52.370526 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370003 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370006 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370008 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370012 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370014 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370017 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370020 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370023 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370026 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370028 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370031 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370033 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370036 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370039 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370042 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370045 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370048 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370051 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370054 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:30:52.370926 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370057 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370060 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370064 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370067 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370070 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370092 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370095 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370098 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370100 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370103 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370105 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370108 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370110 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370113 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370115 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370118 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370120 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370123 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370125 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370128 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:30:52.371449 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370131 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370134 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370136 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370139 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370142 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370144 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370147 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370157 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370160 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370162 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370166 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370168 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370172 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370184 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370187 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370190 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370192 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370195 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370198 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370200 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:30:52.371930 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370203 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370206 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370209 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370211 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370214 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370216 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370220 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370224 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370226 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370229 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370232 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370235 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370238 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370240 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370243 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:30:52.372474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.370248 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370370 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370376 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370380 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370383 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370386 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370389 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370392 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370395 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370398 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370401 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370405 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370407 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370410 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370413 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370415 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370418 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370421 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370423 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:30:52.372855 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370426 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370429 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370431 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370434 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370437 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370439 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370442 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370445 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370447 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370450 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370452 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370455 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370457 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370460 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370462 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370464 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370468 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370471 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370474 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370477 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:30:52.373319 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370480 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370483 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370485 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370488 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370491 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370494 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370497 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370499 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370501 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370504 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370507 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370509 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370512 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370514 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370517 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370519 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370521 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370524 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370527 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370529 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:30:52.373817 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370532 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370535 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370537 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370540 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370543 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370545 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370548 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370550 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370553 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370555 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370557 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370560 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370562 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370565 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370567 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370570 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370573 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370575 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370578 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370580 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:30:52.374314 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370583 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370585 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370588 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370590 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370593 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370595 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370598 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:52.370600 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.370604 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:30:52.374804 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.371244 2561 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:30:52.375822 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.375700 2561 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:30:52.376618 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.376606 2561 server.go:1019] "Starting client certificate rotation"
Apr 17 16:30:52.376717 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.376702 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:30:52.377082 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.377061 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:30:52.397168 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.397148 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:30:52.402633 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.402341 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:30:52.410218 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.410201 2561 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:30:52.415205 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.415189 2561 log.go:25] "Validated CRI v1 image API"
Apr 17 16:30:52.417564 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.417549 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:30:52.420789 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.420761 2561 fs.go:135] Filesystem UUIDs: map[2e335cb8-1528-48cd-ae1e-428f0ee14c57:/dev/nvme0n1p4 45262ba8-96f3-41b7-b9b2-0b782f6c4333:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 17 16:30:52.420844 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.420788 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:30:52.424885 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.424867 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:30:52.426094 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.425968 2561 manager.go:217] Machine: {Timestamp:2026-04-17 16:30:52.424601554 +0000 UTC m=+0.322363000 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099153 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25baaeb7a062f0ffbeb4de29459881 SystemUUID:ec25baae-b7a0-62f0-ffbe-b4de29459881 BootID:24b58f20-1303-47f8-abf7-fa3f1fc41360 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4d:31:48:f9:43 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4d:31:48:f9:43 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:d0:cc:47:57:95 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:30:52.426094 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.426089 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:30:52.426215 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.426186 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:30:52.427651 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.427622 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:30:52.427803 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.427653 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-35.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 16:30:52.427852 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.427810 2561 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:30:52.427852 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.427818 2561 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:30:52.427852 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.427831 2561 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:30:52.428633 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.428622 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:30:52.429413 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.429404 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:30:52.429526 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.429517 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 16:30:52.431340 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.431329 2561 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 16:30:52.431382 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.431345 2561 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 16:30:52.431382 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.431358 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 16:30:52.431382 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.431368 2561 kubelet.go:397] "Adding apiserver pod source"
Apr 17 16:30:52.431382 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.431378 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 16:30:52.432290 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.432277 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:30:52.432340 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.432295 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:30:52.434766 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.434748 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 16:30:52.436253 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.436239 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 16:30:52.437887 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437869 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 16:30:52.437887 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437894 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437907 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437918 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437930 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437941 2561
plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437950 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437960 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437976 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.437986 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.438013 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:30:52.438024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.438025 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:30:52.438708 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.438696 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:30:52.438741 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.438711 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:30:52.442537 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.442523 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:30:52.442601 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.442562 2561 server.go:1295] "Started kubelet" Apr 17 16:30:52.442660 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.442627 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:30:52.442742 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.442706 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:30:52.443549 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:30:52.443511 2561 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:30:52.444334 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.444270 2561 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-35.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:30:52.444325 ip-10-0-130-35 systemd[1]: Started Kubernetes Kubelet. Apr 17 16:30:52.444606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.444589 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:30:52.444771 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.444614 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-35.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:30:52.445439 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.445385 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:30:52.449919 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.449904 2561 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:30:52.452309 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.451501 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-35.ec2.internal.18a731e87267e526 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-35.ec2.internal,UID:ip-10-0-130-35.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-35.ec2.internal,},FirstTimestamp:2026-04-17 16:30:52.442535206 +0000 UTC m=+0.340296652,LastTimestamp:2026-04-17 16:30:52.442535206 +0000 UTC m=+0.340296652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-35.ec2.internal,}" Apr 17 16:30:52.454262 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.454246 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:30:52.454262 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.454258 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:30:52.454794 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.454778 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:30:52.454794 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.454782 2561 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:30:52.454794 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.454799 2561 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:30:52.454955 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.454894 2561 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:30:52.454955 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.454900 2561 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:30:52.455052 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.454963 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:52.455170 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:30:52.455151 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:30:52.455245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.455171 2561 factory.go:55] Registering systemd factory Apr 17 16:30:52.455245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.455181 2561 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:30:52.455474 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.455453 2561 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:30:52.455474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.455470 2561 factory.go:153] Registering CRI-O factory Apr 17 16:30:52.455579 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.455483 2561 factory.go:223] Registration of the crio container factory successfully Apr 17 16:30:52.455579 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.455499 2561 factory.go:103] Registering Raw factory Apr 17 16:30:52.455579 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.455509 2561 manager.go:1196] Started watching for new ooms in manager Apr 17 16:30:52.455825 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.455815 2561 manager.go:319] Starting recovery of all containers Apr 17 16:30:52.465815 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.465797 2561 manager.go:324] Recovery completed Apr 17 16:30:52.466711 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.466682 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-35.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" 
interval="200ms" Apr 17 16:30:52.466711 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.466687 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 16:30:52.471653 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.471640 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:30:52.472908 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.472889 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n42p9" Apr 17 16:30:52.477224 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.477182 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:30:52.477312 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.477246 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:30:52.477312 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.477297 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:30:52.478620 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.478600 2561 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:30:52.478620 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.478614 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:30:52.478728 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.478668 2561 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:30:52.479484 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.479420 2561 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-35.ec2.internal.18a731e874791de1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-35.ec2.internal,UID:ip-10-0-130-35.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-35.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-35.ec2.internal,},FirstTimestamp:2026-04-17 16:30:52.477218273 +0000 UTC m=+0.374979723,LastTimestamp:2026-04-17 16:30:52.477218273 +0000 UTC m=+0.374979723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-35.ec2.internal,}" Apr 17 16:30:52.479962 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.479947 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n42p9" Apr 17 16:30:52.481973 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.481960 2561 policy_none.go:49] "None policy: Start" Apr 17 16:30:52.482014 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.481977 2561 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:30:52.482014 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.481986 2561 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.516460 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.517650 2561 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.517676 2561 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.517700 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.517709 2561 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.517750 2561 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.521247 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.524698 2561 manager.go:341] "Starting Device Plugin manager" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.524727 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.524737 2561 server.go:85] "Starting device plugin registration server" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.525023 2561 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.525034 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.525156 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:30:52.531527 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.525232 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.525239 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.525699 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 16:30:52.531527 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.525734 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:52.618451 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.618347 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal"] Apr 17 16:30:52.618451 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.618450 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:30:52.619803 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.619789 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:30:52.619851 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.619820 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:30:52.619851 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.619833 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:30:52.622283 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.622270 2561 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Apr 17 16:30:52.622438 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.622415 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.622438 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.622444 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:30:52.624250 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.624235 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:30:52.624250 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.624244 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:30:52.624376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.624263 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:30:52.624376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.624270 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:30:52.624376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.624278 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:30:52.624376 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.624284 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:30:52.625113 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.625100 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:30:52.625989 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.625964 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:30:52.626061 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.625993 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:30:52.626061 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.626006 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:30:52.626061 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.626031 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.626638 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.626622 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.626690 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.626657 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:30:52.627449 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.627430 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:30:52.627520 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.627462 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:30:52.627520 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.627477 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:30:52.632264 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.632249 2561 kubelet_node_status.go:81] "Successfully registered node" 
node="ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.632333 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.632269 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-35.ec2.internal\": node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:52.651847 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.651819 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-35.ec2.internal\" not found" node="ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.654100 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.654066 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:52.655602 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.655582 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1918300f3765a6a348983233f548f4f5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal\" (UID: \"1918300f3765a6a348983233f548f4f5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.655696 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.655604 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1918300f3765a6a348983233f548f4f5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal\" (UID: \"1918300f3765a6a348983233f548f4f5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.655696 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.655625 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/d11f5615e8652fc4b9c2e3bdd1d65617-config\") pod \"kube-apiserver-proxy-ip-10-0-130-35.ec2.internal\" (UID: \"d11f5615e8652fc4b9c2e3bdd1d65617\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.656262 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.656246 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-35.ec2.internal\" not found" node="ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.754756 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.754731 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:52.755870 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.755856 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1918300f3765a6a348983233f548f4f5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal\" (UID: \"1918300f3765a6a348983233f548f4f5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.755923 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.755882 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d11f5615e8652fc4b9c2e3bdd1d65617-config\") pod \"kube-apiserver-proxy-ip-10-0-130-35.ec2.internal\" (UID: \"d11f5615e8652fc4b9c2e3bdd1d65617\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.755957 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.755936 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1918300f3765a6a348983233f548f4f5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal\" (UID: \"1918300f3765a6a348983233f548f4f5\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.755957 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.755952 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d11f5615e8652fc4b9c2e3bdd1d65617-config\") pod \"kube-apiserver-proxy-ip-10-0-130-35.ec2.internal\" (UID: \"d11f5615e8652fc4b9c2e3bdd1d65617\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.756027 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.755982 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1918300f3765a6a348983233f548f4f5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal\" (UID: \"1918300f3765a6a348983233f548f4f5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.756027 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.756016 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1918300f3765a6a348983233f548f4f5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal\" (UID: \"1918300f3765a6a348983233f548f4f5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.855362 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.855314 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:52.953953 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.953899 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:52.956709 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:52.956375 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:52.958555 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:52.958539 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" Apr 17 16:30:53.057426 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:53.057386 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:53.157946 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:53.157911 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:53.258390 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:53.258313 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:53.302452 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.302420 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:30:53.359038 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:53.359002 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:53.376265 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.376232 2561 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 16:30:53.376414 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.376395 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:30:53.376481 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.376438 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:30:53.454624 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.454598 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 16:30:53.459127 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:53.459102 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:53.463484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.463459 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:30:53.482204 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.482154 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:25:52 +0000 UTC" deadline="2027-09-28 03:15:14.768960902 +0000 UTC" Apr 17 16:30:53.482204 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.482195 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12682h44m21.286769044s" Apr 17 16:30:53.488338 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.488313 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zbjhc" Apr 17 16:30:53.497273 ip-10-0-130-35 kubenswrapper[2561]: 
I0417 16:30:53.497251 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zbjhc" Apr 17 16:30:53.560087 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:53.560007 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:53.617639 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:53.617597 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1918300f3765a6a348983233f548f4f5.slice/crio-79c3810b73a4ced1b1c35bceeb1e12ca8237c3dea9c31661f9d3648be3501d69 WatchSource:0}: Error finding container 79c3810b73a4ced1b1c35bceeb1e12ca8237c3dea9c31661f9d3648be3501d69: Status 404 returned error can't find the container with id 79c3810b73a4ced1b1c35bceeb1e12ca8237c3dea9c31661f9d3648be3501d69 Apr 17 16:30:53.617965 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:53.617946 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11f5615e8652fc4b9c2e3bdd1d65617.slice/crio-9d8decf7de03f8c07bca9b9e9b93c4ef661ad3ff9e32154cf1da814b9a6d447d WatchSource:0}: Error finding container 9d8decf7de03f8c07bca9b9e9b93c4ef661ad3ff9e32154cf1da814b9a6d447d: Status 404 returned error can't find the container with id 9d8decf7de03f8c07bca9b9e9b93c4ef661ad3ff9e32154cf1da814b9a6d447d Apr 17 16:30:53.622222 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.622206 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:30:53.661001 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:53.660963 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:53.761388 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:53.761362 2561 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-130-35.ec2.internal\" not found" Apr 17 16:30:53.766048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.766030 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:30:53.778702 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.778675 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:30:53.854803 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.854716 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" Apr 17 16:30:53.868347 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.868324 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:30:53.869055 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.869042 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" Apr 17 16:30:53.874958 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:53.874943 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:30:54.432229 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.432195 2561 apiserver.go:52] "Watching apiserver" Apr 17 16:30:54.440216 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.440181 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 16:30:54.441803 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.441763 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal","openshift-dns/node-resolver-lpfgv","openshift-image-registry/node-ca-chsm2","openshift-multus/multus-additional-cni-plugins-642q4","openshift-multus/network-metrics-daemon-wdt9k","openshift-network-diagnostics/network-check-target-rhztb","openshift-ovn-kubernetes/ovnkube-node-qkt5m","kube-system/konnectivity-agent-hvbjq","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k","openshift-cluster-node-tuning-operator/tuned-r45ll","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal","openshift-multus/multus-wgvnr","openshift-network-operator/iptables-alerter-gxtn5"] Apr 17 16:30:54.444491 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.444465 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.446669 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.446646 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lpfgv" Apr 17 16:30:54.446954 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.446911 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 16:30:54.447043 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.446991 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-whnrn\"" Apr 17 16:30:54.447133 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.447048 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:30:54.447260 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.447242 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:30:54.447317 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.447268 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:30:54.447372 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.447322 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:30:54.447421 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.447385 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:30:54.448944 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.448892 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-chsm2" Apr 17 16:30:54.449042 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.448997 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 16:30:54.449042 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.449023 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8lg5q\"" Apr 17 16:30:54.449364 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.449339 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 16:30:54.451090 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.451057 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 16:30:54.451190 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.451181 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-llzzq\"" Apr 17 16:30:54.451388 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.451341 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 16:30:54.451472 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.451396 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 16:30:54.451527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.451466 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.453781 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.453758 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 16:30:54.454412 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.453891 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zwp8b\"" Apr 17 16:30:54.454412 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.453978 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 16:30:54.454412 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.454120 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 16:30:54.454412 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.454205 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 16:30:54.454412 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.454309 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 16:30:54.456934 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.456664 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:30:54.457576 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:54.457550 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf" Apr 17 16:30:54.461461 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.461325 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:30:54.461461 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.461421 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hvbjq" Apr 17 16:30:54.461461 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:54.461428 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2" Apr 17 16:30:54.463674 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.463653 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:30:54.463788 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.463769 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 16:30:54.463952 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.463938 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.464171 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.463691 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7kkll\"" Apr 17 16:30:54.465822 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.465800 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l26dr\" (UniqueName: \"kubernetes.io/projected/f0164bdb-ef92-4743-91fc-03f010abe474-kube-api-access-l26dr\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.465922 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.465836 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65e080a5-8430-43f5-b120-a3fff8102219-tmp-dir\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv" Apr 17 16:30:54.465922 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.465862 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-os-release\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.466028 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.465919 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-systemd\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466028 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.465951 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-log-socket\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466028 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.465977 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-cni-netd\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466028 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466001 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-slash\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466028 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466017 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:30:54.466028 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466025 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-var-lib-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466050 2561 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466051 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-run-ovn-kubernetes\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466169 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466181 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6pmsf\"" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466201 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466230 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t467v\" (UniqueName: \"kubernetes.io/projected/8a91f76e-d64e-4d72-92ff-c27c12f465d2-kube-api-access-t467v\") 
pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466256 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-node-log\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466279 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/65e080a5-8430-43f5-b120-a3fff8102219-hosts-file\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466297 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466321 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-kubelet\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466344 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-run-netns\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466362 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-cni-bin\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466376 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05559253-f52c-49e6-a8e0-1751350669ac-host\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466392 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466418 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-ovn\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:30:54.466440 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0164bdb-ef92-4743-91fc-03f010abe474-ovn-node-metrics-cert\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466492 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9qk\" (UniqueName: \"kubernetes.io/projected/65e080a5-8430-43f5-b120-a3fff8102219-kube-api-access-zs9qk\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466521 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05559253-f52c-49e6-a8e0-1751350669ac-serviceca\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466572 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466606 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-systemd-units\") pod \"ovnkube-node-qkt5m\" (UID: 
\"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466637 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-etc-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466669 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466698 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-cnibin\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466723 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466747 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-ovnkube-script-lib\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.466790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466770 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hqhq\" (UniqueName: \"kubernetes.io/projected/05559253-f52c-49e6-a8e0-1751350669ac-kube-api-access-2hqhq\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2" Apr 17 16:30:54.467474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466808 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-system-cni-dir\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.467474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466824 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-cni-binary-copy\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.467474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466838 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq47t\" (UniqueName: \"kubernetes.io/projected/8fbceac7-0307-49d3-8986-e1b49a4b6760-kube-api-access-mq47t\") pod \"multus-additional-cni-plugins-642q4\" (UID: 
\"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.467474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466884 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-ovnkube-config\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.467474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.466915 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-env-overrides\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.468522 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.468502 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.468607 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.468596 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-gxtn5" Apr 17 16:30:54.470598 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.470578 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 16:30:54.470909 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.470890 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 16:30:54.470991 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.470911 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d5k84\"" Apr 17 16:30:54.470991 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.470984 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-csnw6\"" Apr 17 16:30:54.471110 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.470890 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:30:54.471483 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.471202 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.471483 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.471249 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 16:30:54.473311 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.473279 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 16:30:54.473403 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.473340 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 16:30:54.473536 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.473501 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 16:30:54.473636 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.473601 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lv9n7\"" Apr 17 16:30:54.498652 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.498608 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:25:53 +0000 UTC" deadline="2027-10-29 13:44:51.331367928 +0000 UTC" Apr 17 16:30:54.498652 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.498638 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13437h13m56.832732837s" Apr 17 16:30:54.522885 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.522836 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" 
event={"ID":"d11f5615e8652fc4b9c2e3bdd1d65617","Type":"ContainerStarted","Data":"9d8decf7de03f8c07bca9b9e9b93c4ef661ad3ff9e32154cf1da814b9a6d447d"} Apr 17 16:30:54.524719 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.524690 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" event={"ID":"1918300f3765a6a348983233f548f4f5","Type":"ContainerStarted","Data":"79c3810b73a4ced1b1c35bceeb1e12ca8237c3dea9c31661f9d3648be3501d69"} Apr 17 16:30:54.555885 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.555859 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:30:54.567167 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567131 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-ovn\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567167 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567181 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0164bdb-ef92-4743-91fc-03f010abe474-ovn-node-metrics-cert\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567209 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9qk\" (UniqueName: \"kubernetes.io/projected/65e080a5-8430-43f5-b120-a3fff8102219-kube-api-access-zs9qk\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567233 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05559253-f52c-49e6-a8e0-1751350669ac-serviceca\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567244 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-ovn\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567261 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysconfig\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567306 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-netns\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567333 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-socket-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: 
I0417 16:30:54.567363 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-etc-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567388 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-sys\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.567417 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567413 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-sys-fs\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567439 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp8sx\" (UniqueName: \"kubernetes.io/projected/f962288b-adc2-4e12-9bcf-6ebff0796c4d-kube-api-access-vp8sx\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567471 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-system-cni-dir\") pod \"multus-additional-cni-plugins-642q4\" (UID: 
\"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567476 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-etc-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567495 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysctl-d\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567523 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-kubelet\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567570 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4lw8\" (UniqueName: \"kubernetes.io/projected/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-kube-api-access-r4lw8\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567591 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-system-cni-dir\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567609 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e1e0a80-e552-4d28-b66e-b268268577f9-host-slash\") pod \"iptables-alerter-gxtn5\" (UID: \"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567638 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-ovnkube-config\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567663 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-env-overrides\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567689 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l26dr\" (UniqueName: \"kubernetes.io/projected/f0164bdb-ef92-4743-91fc-03f010abe474-kube-api-access-l26dr\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567676 2561 swap_util.go:74] "error creating dir 
to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567719 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-hostroot\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567745 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-multus-certs\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567770 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8v9z\" (UniqueName: \"kubernetes.io/projected/9e1e0a80-e552-4d28-b66e-b268268577f9-kube-api-access-s8v9z\") pod \"iptables-alerter-gxtn5\" (UID: \"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567796 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-log-socket\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.567816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567821 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-cni-netd\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567846 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05559253-f52c-49e6-a8e0-1751350669ac-serviceca\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567870 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-systemd\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567898 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-run\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.567931 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-system-cni-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568134 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-log-socket\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568175 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-os-release\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568179 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-cni-netd\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568216 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-cni-multus\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568247 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-etc-selinux\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568285 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrq8j\" (UniqueName: \"kubernetes.io/projected/74c7418f-4df1-4f6f-ac48-8530556a0db1-kube-api-access-wrq8j\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568323 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-run-ovn-kubernetes\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568348 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-kubernetes\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568376 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-host\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568431 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-node-log\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.568600 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:30:54.568473 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-node-log\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568443 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-run-ovn-kubernetes\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.568600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568489 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/65e080a5-8430-43f5-b120-a3fff8102219-hosts-file\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568491 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-ovnkube-config\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568527 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.569381 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568539 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/65e080a5-8430-43f5-b120-a3fff8102219-hosts-file\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568567 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-tuned\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568595 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-cni-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568640 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-cni-bin\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568680 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-modprobe-d\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 
16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568700 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-cni-bin\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568742 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-k8s-cni-cncf-io\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568769 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-registration-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568801 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568809 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-env-overrides\") pod \"ovnkube-node-qkt5m\" (UID: 
\"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568861 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74c7418f-4df1-4f6f-ac48-8530556a0db1-tmp\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568890 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-cni-bin\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568915 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e1e0a80-e552-4d28-b66e-b268268577f9-iptables-alerter-script\") pod \"iptables-alerter-gxtn5\" (UID: \"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568944 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-systemd-units\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.569381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568947 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-tuning-conf-dir\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.568972 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569015 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-cnibin\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569020 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-systemd-units\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569043 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: 
I0417 16:30:54.569087 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569111 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-cni-binary-copy\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569122 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-cnibin\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569164 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-etc-kubernetes\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569195 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 
16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569230 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-ovnkube-script-lib\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569277 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hqhq\" (UniqueName: \"kubernetes.io/projected/05559253-f52c-49e6-a8e0-1751350669ac-kube-api-access-2hqhq\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2"
Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569312 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-cni-binary-copy\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4"
Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569338 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mq47t\" (UniqueName: \"kubernetes.io/projected/8fbceac7-0307-49d3-8986-e1b49a4b6760-kube-api-access-mq47t\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4"
Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569365 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-cnibin\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569390 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-conf-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569416 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65e080a5-8430-43f5-b120-a3fff8102219-tmp-dir\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv"
Apr 17 16:30:54.570040 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569439 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-os-release\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569463 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569502 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-device-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569555 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-systemd\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569580 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-var-lib-kubelet\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569602 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-slash\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569623 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-var-lib-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569645 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569676 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569700 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t467v\" (UniqueName: \"kubernetes.io/projected/8a91f76e-d64e-4d72-92ff-c27c12f465d2-kube-api-access-t467v\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569710 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0164bdb-ef92-4743-91fc-03f010abe474-ovnkube-script-lib\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569727 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/46096eec-9a89-49c4-a719-dad5e2d71c2f-agent-certs\") pod \"konnectivity-agent-hvbjq\" (UID: \"46096eec-9a89-49c4-a719-dad5e2d71c2f\") " pod="kube-system/konnectivity-agent-hvbjq"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569767 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-systemd\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569776 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-run-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569813 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-slash\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569853 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/46096eec-9a89-49c4-a719-dad5e2d71c2f-konnectivity-ca\") pod \"konnectivity-agent-hvbjq\" (UID: \"46096eec-9a89-49c4-a719-dad5e2d71c2f\") " pod="kube-system/konnectivity-agent-hvbjq"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569883 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fbceac7-0307-49d3-8986-e1b49a4b6760-os-release\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4"
Apr 17 16:30:54.570833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569893 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysctl-conf\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569919 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-lib-modules\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569943 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-socket-dir-parent\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.569973 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-kubelet\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570014 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-run-netns\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570047 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05559253-f52c-49e6-a8e0-1751350669ac-host\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570092 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570120 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-daemon-config\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:54.570213 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570221 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-kubelet\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:54.570312 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. No retries permitted until 2026-04-17 16:30:55.070282746 +0000 UTC m=+2.968044180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570477 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65e080a5-8430-43f5-b120-a3fff8102219-tmp-dir\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570570 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-host-run-netns\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570656 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0164bdb-ef92-4743-91fc-03f010abe474-var-lib-openvswitch\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.570821 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05559253-f52c-49e6-a8e0-1751350669ac-host\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.571010 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.571128 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fbceac7-0307-49d3-8986-e1b49a4b6760-cni-binary-copy\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4"
Apr 17 16:30:54.571529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.571362 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0164bdb-ef92-4743-91fc-03f010abe474-ovn-node-metrics-cert\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.575409 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.575383 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9qk\" (UniqueName: \"kubernetes.io/projected/65e080a5-8430-43f5-b120-a3fff8102219-kube-api-access-zs9qk\") pod \"node-resolver-lpfgv\" (UID: \"65e080a5-8430-43f5-b120-a3fff8102219\") " pod="openshift-dns/node-resolver-lpfgv"
Apr 17 16:30:54.576369 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.576348 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l26dr\" (UniqueName: \"kubernetes.io/projected/f0164bdb-ef92-4743-91fc-03f010abe474-kube-api-access-l26dr\") pod \"ovnkube-node-qkt5m\" (UID: \"f0164bdb-ef92-4743-91fc-03f010abe474\") " pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m"
Apr 17 16:30:54.580254 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:54.580225 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:30:54.580254 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:54.580244 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:30:54.580254 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:54.580256 2561 projected.go:194] Error preparing data for projected volume kube-api-access-nwv9w for pod openshift-network-diagnostics/network-check-target-rhztb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:30:54.580464 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:54.580319 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w podName:126fa63b-6174-4d95-bf2c-01daa7a91ccf nodeName:}" failed. No retries permitted until 2026-04-17 16:30:55.080301639 +0000 UTC m=+2.978063072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nwv9w" (UniqueName: "kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w") pod "network-check-target-rhztb" (UID: "126fa63b-6174-4d95-bf2c-01daa7a91ccf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:30:54.581967 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.581944 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t467v\" (UniqueName: \"kubernetes.io/projected/8a91f76e-d64e-4d72-92ff-c27c12f465d2-kube-api-access-t467v\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:30:54.582841 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.582817 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hqhq\" (UniqueName: \"kubernetes.io/projected/05559253-f52c-49e6-a8e0-1751350669ac-kube-api-access-2hqhq\") pod \"node-ca-chsm2\" (UID: \"05559253-f52c-49e6-a8e0-1751350669ac\") " pod="openshift-image-registry/node-ca-chsm2"
Apr 17 16:30:54.582964 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.582943 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq47t\" (UniqueName: \"kubernetes.io/projected/8fbceac7-0307-49d3-8986-e1b49a4b6760-kube-api-access-mq47t\") pod \"multus-additional-cni-plugins-642q4\" (UID: \"8fbceac7-0307-49d3-8986-e1b49a4b6760\") " pod="openshift-multus/multus-additional-cni-plugins-642q4"
Apr 17 16:30:54.670687 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670655 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/46096eec-9a89-49c4-a719-dad5e2d71c2f-agent-certs\") pod \"konnectivity-agent-hvbjq\" (UID: \"46096eec-9a89-49c4-a719-dad5e2d71c2f\") " pod="kube-system/konnectivity-agent-hvbjq"
Apr 17 16:30:54.670687 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670690 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/46096eec-9a89-49c4-a719-dad5e2d71c2f-konnectivity-ca\") pod \"konnectivity-agent-hvbjq\" (UID: \"46096eec-9a89-49c4-a719-dad5e2d71c2f\") " pod="kube-system/konnectivity-agent-hvbjq"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670723 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysctl-conf\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670749 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-lib-modules\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670774 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-socket-dir-parent\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670815 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-daemon-config\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670838 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysconfig\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670860 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-netns\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670887 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-socket-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670913 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysctl-conf\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670928 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-sys\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.670975 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-sys-fs\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.671006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671001 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp8sx\" (UniqueName: \"kubernetes.io/projected/f962288b-adc2-4e12-9bcf-6ebff0796c4d-kube-api-access-vp8sx\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671027 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysctl-d\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671174 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysctl-d\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671177 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-lib-modules\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671235 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-kubelet\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671239 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-sysconfig\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671274 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4lw8\" (UniqueName: \"kubernetes.io/projected/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-kube-api-access-r4lw8\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671309 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-sys-fs\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671305 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e1e0a80-e552-4d28-b66e-b268268577f9-host-slash\") pod \"iptables-alerter-gxtn5\" (UID: \"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671357 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-hostroot\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671375 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/46096eec-9a89-49c4-a719-dad5e2d71c2f-konnectivity-ca\") pod \"konnectivity-agent-hvbjq\" (UID: \"46096eec-9a89-49c4-a719-dad5e2d71c2f\") " pod="kube-system/konnectivity-agent-hvbjq"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671386 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-multus-certs\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671359 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-sys\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671414 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8v9z\" (UniqueName: \"kubernetes.io/projected/9e1e0a80-e552-4d28-b66e-b268268577f9-kube-api-access-s8v9z\") pod \"iptables-alerter-gxtn5\" (UID: \"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671464 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-socket-dir-parent\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671505 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-netns\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.671531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671504 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-daemon-config\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671557 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e1e0a80-e552-4d28-b66e-b268268577f9-host-slash\") pod \"iptables-alerter-gxtn5\" (UID: \"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671565 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-hostroot\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671548 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-socket-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671618 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-multus-certs\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671660 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-systemd\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671667 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-kubelet\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671707 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-run\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671732 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-system-cni-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671754 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-os-release\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671756 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-systemd\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671773 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-run\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671822 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-system-cni-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671834 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-cni-multus\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671863 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-etc-selinux\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671892 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrq8j\" (UniqueName: \"kubernetes.io/projected/74c7418f-4df1-4f6f-ac48-8530556a0db1-kube-api-access-wrq8j\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671896 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-cni-multus\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671903 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-os-release\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr"
Apr 17 16:30:54.672271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671911 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-etc-selinux\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k"
Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671917 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-kubernetes\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671952 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-host\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.671998 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-host\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll"
Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672024 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-kubernetes\") pod \"tuned-r45ll\" (UID: 
\"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672043 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-tuned\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672088 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-cni-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672115 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-modprobe-d\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672168 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-cni-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672192 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-k8s-cni-cncf-io\") pod \"multus-wgvnr\" (UID: 
\"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672200 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-modprobe-d\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672235 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-registration-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672291 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-run-k8s-cni-cncf-io\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672300 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74c7418f-4df1-4f6f-ac48-8530556a0db1-tmp\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672347 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-registration-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: 
\"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672383 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-cni-bin\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672421 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e1e0a80-e552-4d28-b66e-b268268577f9-iptables-alerter-script\") pod \"iptables-alerter-gxtn5\" (UID: \"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672449 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-cni-binary-copy\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673051 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672473 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-etc-kubernetes\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672504 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-cnibin\") pod \"multus-wgvnr\" (UID: 
\"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672532 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-conf-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672560 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672585 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-device-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672611 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-var-lib-kubelet\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672695 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74c7418f-4df1-4f6f-ac48-8530556a0db1-var-lib-kubelet\") pod 
\"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672742 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-cnibin\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672736 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-host-var-lib-cni-bin\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672781 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-multus-conf-dir\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672822 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672855 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-etc-kubernetes\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " 
pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.672901 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f962288b-adc2-4e12-9bcf-6ebff0796c4d-device-dir\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.673032 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-cni-binary-copy\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.673883 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.673052 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9e1e0a80-e552-4d28-b66e-b268268577f9-iptables-alerter-script\") pod \"iptables-alerter-gxtn5\" (UID: \"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5" Apr 17 16:30:54.674626 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.674344 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/46096eec-9a89-49c4-a719-dad5e2d71c2f-agent-certs\") pod \"konnectivity-agent-hvbjq\" (UID: \"46096eec-9a89-49c4-a719-dad5e2d71c2f\") " pod="kube-system/konnectivity-agent-hvbjq" Apr 17 16:30:54.674626 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.674444 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/74c7418f-4df1-4f6f-ac48-8530556a0db1-etc-tuned\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.675774 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.675742 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74c7418f-4df1-4f6f-ac48-8530556a0db1-tmp\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.679744 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.679721 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4lw8\" (UniqueName: \"kubernetes.io/projected/c3d773c7-2b0f-4dff-a3ec-72f61e88111c-kube-api-access-r4lw8\") pod \"multus-wgvnr\" (UID: \"c3d773c7-2b0f-4dff-a3ec-72f61e88111c\") " pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.680217 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.680194 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp8sx\" (UniqueName: \"kubernetes.io/projected/f962288b-adc2-4e12-9bcf-6ebff0796c4d-kube-api-access-vp8sx\") pod \"aws-ebs-csi-driver-node-66g2k\" (UID: \"f962288b-adc2-4e12-9bcf-6ebff0796c4d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:54.680329 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.680301 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrq8j\" (UniqueName: \"kubernetes.io/projected/74c7418f-4df1-4f6f-ac48-8530556a0db1-kube-api-access-wrq8j\") pod \"tuned-r45ll\" (UID: \"74c7418f-4df1-4f6f-ac48-8530556a0db1\") " pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.680394 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.680301 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8v9z\" (UniqueName: \"kubernetes.io/projected/9e1e0a80-e552-4d28-b66e-b268268577f9-kube-api-access-s8v9z\") pod \"iptables-alerter-gxtn5\" (UID: 
\"9e1e0a80-e552-4d28-b66e-b268268577f9\") " pod="openshift-network-operator/iptables-alerter-gxtn5" Apr 17 16:30:54.732410 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.732341 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:30:54.758097 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.757931 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:30:54.774363 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.774335 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lpfgv" Apr 17 16:30:54.782186 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.782162 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-chsm2" Apr 17 16:30:54.787729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.787708 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-642q4" Apr 17 16:30:54.795389 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.795371 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hvbjq" Apr 17 16:30:54.803000 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.802981 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r45ll" Apr 17 16:30:54.810593 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.810575 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wgvnr" Apr 17 16:30:54.818197 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.818177 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-gxtn5" Apr 17 16:30:54.824717 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:54.824699 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" Apr 17 16:30:55.074987 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.074893 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:30:55.075176 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:55.075067 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:30:55.075176 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:55.075159 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. No retries permitted until 2026-04-17 16:30:56.07513975 +0000 UTC m=+3.972901187 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:30:55.175300 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.175259 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:30:55.175490 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:55.175442 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:30:55.175490 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:55.175466 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:30:55.175490 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:55.175478 2561 projected.go:194] Error preparing data for projected volume kube-api-access-nwv9w for pod openshift-network-diagnostics/network-check-target-rhztb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:30:55.175633 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:55.175546 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w podName:126fa63b-6174-4d95-bf2c-01daa7a91ccf nodeName:}" failed. 
No retries permitted until 2026-04-17 16:30:56.175528095 +0000 UTC m=+4.073289548 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwv9w" (UniqueName: "kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w") pod "network-check-target-rhztb" (UID: "126fa63b-6174-4d95-bf2c-01daa7a91ccf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:30:55.453831 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.453803 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3d773c7_2b0f_4dff_a3ec_72f61e88111c.slice/crio-b531dc4dd8b926403596d02295c6574e0ff9b2bc86b001a0cc0d336e96c84fdb WatchSource:0}: Error finding container b531dc4dd8b926403596d02295c6574e0ff9b2bc86b001a0cc0d336e96c84fdb: Status 404 returned error can't find the container with id b531dc4dd8b926403596d02295c6574e0ff9b2bc86b001a0cc0d336e96c84fdb Apr 17 16:30:55.455102 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.455060 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05559253_f52c_49e6_a8e0_1751350669ac.slice/crio-86e7ae7ac1415ab671d1ce8e302e74f734ae3834c47f08889711b426f302e10b WatchSource:0}: Error finding container 86e7ae7ac1415ab671d1ce8e302e74f734ae3834c47f08889711b426f302e10b: Status 404 returned error can't find the container with id 86e7ae7ac1415ab671d1ce8e302e74f734ae3834c47f08889711b426f302e10b Apr 17 16:30:55.456174 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.456128 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65e080a5_8430_43f5_b120_a3fff8102219.slice/crio-3ccba31de04de995516edf99a7326c17fd8162428799c5bee0a5a2a6ba9b4ff6 WatchSource:0}: Error finding container 
3ccba31de04de995516edf99a7326c17fd8162428799c5bee0a5a2a6ba9b4ff6: Status 404 returned error can't find the container with id 3ccba31de04de995516edf99a7326c17fd8162428799c5bee0a5a2a6ba9b4ff6 Apr 17 16:30:55.459569 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.459540 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46096eec_9a89_49c4_a719_dad5e2d71c2f.slice/crio-6ecb7f34e81087dbb73072f03eeab0cc4ce010530e70703ddf365af8bf4802af WatchSource:0}: Error finding container 6ecb7f34e81087dbb73072f03eeab0cc4ce010530e70703ddf365af8bf4802af: Status 404 returned error can't find the container with id 6ecb7f34e81087dbb73072f03eeab0cc4ce010530e70703ddf365af8bf4802af Apr 17 16:30:55.480728 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.480701 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0164bdb_ef92_4743_91fc_03f010abe474.slice/crio-0a2a36ab338d8b884cb8c578fab9c2aa19228db4ef64240df3e4ee9460194101 WatchSource:0}: Error finding container 0a2a36ab338d8b884cb8c578fab9c2aa19228db4ef64240df3e4ee9460194101: Status 404 returned error can't find the container with id 0a2a36ab338d8b884cb8c578fab9c2aa19228db4ef64240df3e4ee9460194101 Apr 17 16:30:55.481462 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.481440 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf962288b_adc2_4e12_9bcf_6ebff0796c4d.slice/crio-277176402c502e143e1f60bf67fa9d33085a48b53f7dcd85e7b991aeb5c90f49 WatchSource:0}: Error finding container 277176402c502e143e1f60bf67fa9d33085a48b53f7dcd85e7b991aeb5c90f49: Status 404 returned error can't find the container with id 277176402c502e143e1f60bf67fa9d33085a48b53f7dcd85e7b991aeb5c90f49 Apr 17 16:30:55.482333 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.482305 2561 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e1e0a80_e552_4d28_b66e_b268268577f9.slice/crio-e7167751c7c2bbf0869370b8107b4a63d38fef761f86b092ab6954a835fa2bc6 WatchSource:0}: Error finding container e7167751c7c2bbf0869370b8107b4a63d38fef761f86b092ab6954a835fa2bc6: Status 404 returned error can't find the container with id e7167751c7c2bbf0869370b8107b4a63d38fef761f86b092ab6954a835fa2bc6 Apr 17 16:30:55.483390 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.483158 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c7418f_4df1_4f6f_ac48_8530556a0db1.slice/crio-4d9bbf0f343433e8990779b3282d3b593a828730944d4495b85f29968ae07116 WatchSource:0}: Error finding container 4d9bbf0f343433e8990779b3282d3b593a828730944d4495b85f29968ae07116: Status 404 returned error can't find the container with id 4d9bbf0f343433e8990779b3282d3b593a828730944d4495b85f29968ae07116 Apr 17 16:30:55.484655 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:30:55.484615 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fbceac7_0307_49d3_8986_e1b49a4b6760.slice/crio-21461ca537f81aa55e06edb4ea51c5c8a72fa2b94c76432cc68bc4b5c985dd69 WatchSource:0}: Error finding container 21461ca537f81aa55e06edb4ea51c5c8a72fa2b94c76432cc68bc4b5c985dd69: Status 404 returned error can't find the container with id 21461ca537f81aa55e06edb4ea51c5c8a72fa2b94c76432cc68bc4b5c985dd69 Apr 17 16:30:55.499173 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.499147 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:25:53 +0000 UTC" deadline="2028-01-14 12:09:05.847374376 +0000 UTC" Apr 17 16:30:55.499173 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.499170 2561 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15283h38m10.348205965s"
Apr 17 16:30:55.527504 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.527477 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-642q4" event={"ID":"8fbceac7-0307-49d3-8986-e1b49a4b6760","Type":"ContainerStarted","Data":"21461ca537f81aa55e06edb4ea51c5c8a72fa2b94c76432cc68bc4b5c985dd69"}
Apr 17 16:30:55.528316 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.528296 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gxtn5" event={"ID":"9e1e0a80-e552-4d28-b66e-b268268577f9","Type":"ContainerStarted","Data":"e7167751c7c2bbf0869370b8107b4a63d38fef761f86b092ab6954a835fa2bc6"}
Apr 17 16:30:55.529122 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.529100 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" event={"ID":"f962288b-adc2-4e12-9bcf-6ebff0796c4d","Type":"ContainerStarted","Data":"277176402c502e143e1f60bf67fa9d33085a48b53f7dcd85e7b991aeb5c90f49"}
Apr 17 16:30:55.529980 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.529959 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"0a2a36ab338d8b884cb8c578fab9c2aa19228db4ef64240df3e4ee9460194101"}
Apr 17 16:30:55.530871 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.530843 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hvbjq" event={"ID":"46096eec-9a89-49c4-a719-dad5e2d71c2f","Type":"ContainerStarted","Data":"6ecb7f34e81087dbb73072f03eeab0cc4ce010530e70703ddf365af8bf4802af"}
Apr 17 16:30:55.531669 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.531653 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lpfgv" event={"ID":"65e080a5-8430-43f5-b120-a3fff8102219","Type":"ContainerStarted","Data":"3ccba31de04de995516edf99a7326c17fd8162428799c5bee0a5a2a6ba9b4ff6"}
Apr 17 16:30:55.532475 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.532458 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r45ll" event={"ID":"74c7418f-4df1-4f6f-ac48-8530556a0db1","Type":"ContainerStarted","Data":"4d9bbf0f343433e8990779b3282d3b593a828730944d4495b85f29968ae07116"}
Apr 17 16:30:55.533236 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.533220 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-chsm2" event={"ID":"05559253-f52c-49e6-a8e0-1751350669ac","Type":"ContainerStarted","Data":"86e7ae7ac1415ab671d1ce8e302e74f734ae3834c47f08889711b426f302e10b"}
Apr 17 16:30:55.534154 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.534127 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgvnr" event={"ID":"c3d773c7-2b0f-4dff-a3ec-72f61e88111c","Type":"ContainerStarted","Data":"b531dc4dd8b926403596d02295c6574e0ff9b2bc86b001a0cc0d336e96c84fdb"}
Apr 17 16:30:55.672969 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:55.672942 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:30:56.083796 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:56.083675 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:30:56.083964 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:56.083842 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:30:56.083964 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:56.083914 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. No retries permitted until 2026-04-17 16:30:58.083893252 +0000 UTC m=+5.981654690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:30:56.185297 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:56.184514 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:30:56.185297 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:56.184711 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:30:56.185297 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:56.184729 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:30:56.185297 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:56.184741 2561 projected.go:194] Error preparing data for projected volume kube-api-access-nwv9w for pod openshift-network-diagnostics/network-check-target-rhztb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:30:56.185297 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:56.184804 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w podName:126fa63b-6174-4d95-bf2c-01daa7a91ccf nodeName:}" failed. No retries permitted until 2026-04-17 16:30:58.184785053 +0000 UTC m=+6.082546509 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwv9w" (UniqueName: "kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w") pod "network-check-target-rhztb" (UID: "126fa63b-6174-4d95-bf2c-01daa7a91ccf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:30:56.520566 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:56.520491 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:30:56.520989 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:56.520627 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:30:56.520989 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:56.520920 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:30:56.521182 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:56.521017 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:30:56.554491 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:56.554450 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" event={"ID":"d11f5615e8652fc4b9c2e3bdd1d65617","Type":"ContainerStarted","Data":"0e682637e6ba4cbe0de9634a609ad445af944abc550c49d8eca8b6d0b68bc28e"}
Apr 17 16:30:56.571991 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:56.571923 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-35.ec2.internal" podStartSLOduration=3.571905612 podStartE2EDuration="3.571905612s" podCreationTimestamp="2026-04-17 16:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:30:56.571625645 +0000 UTC m=+4.469387100" watchObservedRunningTime="2026-04-17 16:30:56.571905612 +0000 UTC m=+4.469667067"
Apr 17 16:30:56.574489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:56.574395 2561 generic.go:358] "Generic (PLEG): container finished" podID="1918300f3765a6a348983233f548f4f5" containerID="129b322fcafb831d177b8e530ab98e5bde8910a890439a17ed20a690bde42f81" exitCode=0
Apr 17 16:30:56.574489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:56.574456 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" event={"ID":"1918300f3765a6a348983233f548f4f5","Type":"ContainerDied","Data":"129b322fcafb831d177b8e530ab98e5bde8910a890439a17ed20a690bde42f81"}
Apr 17 16:30:57.611740 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:57.611680 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" event={"ID":"1918300f3765a6a348983233f548f4f5","Type":"ContainerStarted","Data":"4a3d49d76478b731d0067b89a25d1d02ea0843b30e38941122276ec2db26818a"}
Apr 17 16:30:58.108112 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:58.107423 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:30:58.108112 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:58.107603 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:30:58.108112 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:58.107669 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:02.107650848 +0000 UTC m=+10.005412287 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:30:58.209529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:58.208867 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:30:58.209529 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:58.209047 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:30:58.209529 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:58.209067 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:30:58.209529 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:58.209099 2561 projected.go:194] Error preparing data for projected volume kube-api-access-nwv9w for pod openshift-network-diagnostics/network-check-target-rhztb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:30:58.209529 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:58.209171 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w podName:126fa63b-6174-4d95-bf2c-01daa7a91ccf nodeName:}" failed. No retries permitted until 2026-04-17 16:31:02.209150851 +0000 UTC m=+10.106912290 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwv9w" (UniqueName: "kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w") pod "network-check-target-rhztb" (UID: "126fa63b-6174-4d95-bf2c-01daa7a91ccf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:30:58.518858 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:58.518362 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:30:58.518858 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:58.518505 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:30:58.519093 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:30:58.518929 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:30:58.519093 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:30:58.519022 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:00.518627 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:00.518597 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:00.519096 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:00.518640 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:00.519096 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:00.518779 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:00.519096 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:00.518998 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:02.141460 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:02.140824 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:02.141460 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:02.141010 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:02.141460 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:02.141091 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:10.141055434 +0000 UTC m=+18.038816885 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:02.242178 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:02.242130 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:02.242363 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:02.242348 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:02.242418 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:02.242367 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:02.242418 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:02.242380 2561 projected.go:194] Error preparing data for projected volume kube-api-access-nwv9w for pod openshift-network-diagnostics/network-check-target-rhztb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:02.242512 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:02.242444 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w podName:126fa63b-6174-4d95-bf2c-01daa7a91ccf nodeName:}" failed. No retries permitted until 2026-04-17 16:31:10.242426193 +0000 UTC m=+18.140187641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwv9w" (UniqueName: "kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w") pod "network-check-target-rhztb" (UID: "126fa63b-6174-4d95-bf2c-01daa7a91ccf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:02.519108 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:02.518851 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:02.519108 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:02.518851 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:02.523799 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:02.523117 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:02.523799 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:02.523295 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:04.518763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:04.518730 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:04.519242 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:04.518745 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:04.519242 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:04.518882 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:04.519242 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:04.519018 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:06.518178 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:06.518131 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:06.518178 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:06.518162 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:06.518692 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:06.518282 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:06.518692 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:06.518405 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:08.519041 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:08.518821 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:08.519494 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:08.518824 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:08.519494 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:08.519162 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:08.519494 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:08.519309 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:10.195543 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:10.195502 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:10.195986 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:10.195667 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:10.195986 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:10.195754 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.195731824 +0000 UTC m=+34.093493277 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 16:31:10.296127 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:10.296064 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:10.296312 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:10.296246 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 16:31:10.296312 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:10.296271 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 16:31:10.296312 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:10.296284 2561 projected.go:194] Error preparing data for projected volume kube-api-access-nwv9w for pod openshift-network-diagnostics/network-check-target-rhztb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:10.296459 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:10.296350 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w podName:126fa63b-6174-4d95-bf2c-01daa7a91ccf nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.29633104 +0000 UTC m=+34.194092490 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwv9w" (UniqueName: "kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w") pod "network-check-target-rhztb" (UID: "126fa63b-6174-4d95-bf2c-01daa7a91ccf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 16:31:10.518982 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:10.518896 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:10.519190 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:10.519026 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:10.519190 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:10.519098 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:10.519303 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:10.519218 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:12.519129 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:12.518611 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:12.519129 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:12.518731 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:12.519129 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:12.518783 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:12.519129 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:12.518844 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:12.639319 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:12.639284 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-chsm2" event={"ID":"05559253-f52c-49e6-a8e0-1751350669ac","Type":"ContainerStarted","Data":"4ec2b9e936158edf51d9f1855ddda44d93c064c2b3e4677ab01d18fae16aec32"}
Apr 17 16:31:12.658497 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:12.658452 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-35.ec2.internal" podStartSLOduration=19.658439525 podStartE2EDuration="19.658439525s" podCreationTimestamp="2026-04-17 16:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:30:57.625865572 +0000 UTC m=+5.523627044" watchObservedRunningTime="2026-04-17 16:31:12.658439525 +0000 UTC m=+20.556200981"
Apr 17 16:31:12.658668 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:12.658649 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-chsm2" podStartSLOduration=11.686941106999999 podStartE2EDuration="20.658645453s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.458299902 +0000 UTC m=+3.356061359" lastFinishedPulling="2026-04-17 16:31:04.430004252 +0000 UTC m=+12.327765705" observedRunningTime="2026-04-17 16:31:12.658273966 +0000 UTC m=+20.556035421" watchObservedRunningTime="2026-04-17 16:31:12.658645453 +0000 UTC m=+20.556406908"
Apr 17 16:31:13.642536 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.642306 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" event={"ID":"f962288b-adc2-4e12-9bcf-6ebff0796c4d","Type":"ContainerStarted","Data":"a1206424035c11fc42e4dc8c200be8a58d97ec93dd216c706e9d814f6f13b84e"}
Apr 17 16:31:13.645332 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.645275 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"a03ca8a042af3b3c97c0fe49623ef33b34efed9f0c62f000189c1fbb3e0656a3"}
Apr 17 16:31:13.645332 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.645313 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"b7444a5ae84d7fb0dc38cd25992826484ae3d9a367c24910610c6146295cb4ad"}
Apr 17 16:31:13.645332 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.645327 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"519c1905f8608c945b19972e43ac81d8585ffaee142a890740789085fa8ea441"}
Apr 17 16:31:13.645592 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.645338 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"ad6101e0eda02d73a52e979ba14b622eecb3789d5699fe9e7b40a7f313a51f3e"}
Apr 17 16:31:13.645592 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.645381 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"897232ac3c9b36d90e1fb73ef6cb7688b602c362f478d5083bb73da909a7062e"}
Apr 17 16:31:13.645592 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.645396 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"7543dc2f4acf1d3e60008f258e07966e546dd30d6edbb528b5aabec3d98c0d92"}
Apr 17 16:31:13.646607 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.646580 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hvbjq" event={"ID":"46096eec-9a89-49c4-a719-dad5e2d71c2f","Type":"ContainerStarted","Data":"dd2e6cac91ef18b7d15dd254f6a594655f165f25a1a944d89acdcd10059f2300"}
Apr 17 16:31:13.647935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.647912 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lpfgv" event={"ID":"65e080a5-8430-43f5-b120-a3fff8102219","Type":"ContainerStarted","Data":"5a2b79a188fcb3209660da5b465ef29220d8a0b6c1c77707f8bc3d4e330c4336"}
Apr 17 16:31:13.649184 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.649157 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r45ll" event={"ID":"74c7418f-4df1-4f6f-ac48-8530556a0db1","Type":"ContainerStarted","Data":"a0d7f4c68569caffc46cb9dee877d0f880fc3472f79cbc3d291e050201f7e26a"}
Apr 17 16:31:13.650560 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.650537 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgvnr" event={"ID":"c3d773c7-2b0f-4dff-a3ec-72f61e88111c","Type":"ContainerStarted","Data":"ac13000e6b94b49b6e9f6a6ee3999eef5bda32e2d74c130e1555a809fa5ebf7e"}
Apr 17 16:31:13.651911 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.651890 2561 generic.go:358] "Generic (PLEG): container finished" podID="8fbceac7-0307-49d3-8986-e1b49a4b6760" containerID="376393a4decb79426189edffb7848568fe39e216073add1e896df73a51ca0b16" exitCode=0
Apr 17 16:31:13.652014 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.651995 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-642q4" event={"ID":"8fbceac7-0307-49d3-8986-e1b49a4b6760","Type":"ContainerDied","Data":"376393a4decb79426189edffb7848568fe39e216073add1e896df73a51ca0b16"}
Apr 17 16:31:13.661804 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.661764 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hvbjq" podStartSLOduration=12.708095857 podStartE2EDuration="21.661751928s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.479025268 +0000 UTC m=+3.376786701" lastFinishedPulling="2026-04-17 16:31:04.432681323 +0000 UTC m=+12.330442772" observedRunningTime="2026-04-17 16:31:13.661124976 +0000 UTC m=+21.558886431" watchObservedRunningTime="2026-04-17 16:31:13.661751928 +0000 UTC m=+21.559513383"
Apr 17 16:31:13.675655 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.675616 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lpfgv" podStartSLOduration=12.698575572 podStartE2EDuration="21.675604412s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.458628204 +0000 UTC m=+3.356389649" lastFinishedPulling="2026-04-17 16:31:04.435657057 +0000 UTC m=+12.333418489" observedRunningTime="2026-04-17 16:31:13.675377388 +0000 UTC m=+21.573138843" watchObservedRunningTime="2026-04-17 16:31:13.675604412 +0000 UTC m=+21.573365865"
Apr 17 16:31:13.712590 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.712534 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-r45ll" podStartSLOduration=4.527650266 podStartE2EDuration="21.712517421s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.48743526 +0000 UTC m=+3.385196693" lastFinishedPulling="2026-04-17 16:31:12.672302415 +0000 UTC m=+20.570063848" observedRunningTime="2026-04-17 16:31:13.711736727 +0000 UTC m=+21.609498183" watchObservedRunningTime="2026-04-17 16:31:13.712517421 +0000 UTC m=+21.610278877"
Apr 17 16:31:13.728142 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:13.728067 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wgvnr" podStartSLOduration=4.452170111 podStartE2EDuration="21.728050807s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.456257313 +0000 UTC m=+3.354018761" lastFinishedPulling="2026-04-17 16:31:12.732138008 +0000 UTC m=+20.629899457" observedRunningTime="2026-04-17 16:31:13.727183406 +0000 UTC m=+21.624944860" watchObservedRunningTime="2026-04-17 16:31:13.728050807 +0000 UTC m=+21.625812261"
Apr 17 16:31:14.068410 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.068250 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 16:31:14.518504 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.518471 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:14.518792 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:14.518580 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:14.518792 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.518640 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:14.518792 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:14.518731 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf" Apr 17 16:31:14.533967 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.533876 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:14.068406182Z","UUID":"348b1a4e-e447-48e0-9e52-6fce877e71a6","Handler":null,"Name":"","Endpoint":""} Apr 17 16:31:14.536707 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.536689 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 16:31:14.536834 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.536717 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:31:14.655958 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.655921 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gxtn5" event={"ID":"9e1e0a80-e552-4d28-b66e-b268268577f9","Type":"ContainerStarted","Data":"4f139f55fd9c0a1532bf16015d5a0cfc38c0ad73abb11fed3f54948387bf30e5"} Apr 17 16:31:14.657930 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.657902 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" 
event={"ID":"f962288b-adc2-4e12-9bcf-6ebff0796c4d","Type":"ContainerStarted","Data":"21f259a210e1d65754fc3a932c51a443ba73fd6ca7c49b70fbad8e62fe0f56cf"} Apr 17 16:31:14.681110 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:14.681033 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gxtn5" podStartSLOduration=13.735202374 podStartE2EDuration="22.681013792s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.487235688 +0000 UTC m=+3.384997121" lastFinishedPulling="2026-04-17 16:31:04.433047091 +0000 UTC m=+12.330808539" observedRunningTime="2026-04-17 16:31:14.680379075 +0000 UTC m=+22.578140531" watchObservedRunningTime="2026-04-17 16:31:14.681013792 +0000 UTC m=+22.578775248" Apr 17 16:31:15.661744 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:15.661702 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" event={"ID":"f962288b-adc2-4e12-9bcf-6ebff0796c4d","Type":"ContainerStarted","Data":"bef0931951dca7b3f8a6b5ca26a49b41ef56d60d89a3e0295c5515112a765344"} Apr 17 16:31:15.666763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:15.666725 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"00b1a2e09348b183537e19610ad985a04017926dd2d201ec73fd703c9c14c0ff"} Apr 17 16:31:16.518272 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:16.518242 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:31:16.518453 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:16.518242 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:16.518453 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:16.518353 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2" Apr 17 16:31:16.518453 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:16.518440 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf" Apr 17 16:31:16.960546 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:16.960501 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hvbjq" Apr 17 16:31:16.961252 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:16.961232 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hvbjq" Apr 17 16:31:16.975870 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:16.975824 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-66g2k" podStartSLOduration=4.479350688 podStartE2EDuration="23.975810789s" podCreationTimestamp="2026-04-17 16:30:53 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.487503795 +0000 UTC m=+3.385265234" lastFinishedPulling="2026-04-17 16:31:14.983963899 +0000 UTC m=+22.881725335" observedRunningTime="2026-04-17 16:31:15.687542648 
+0000 UTC m=+23.585304102" watchObservedRunningTime="2026-04-17 16:31:16.975810789 +0000 UTC m=+24.873572274" Apr 17 16:31:17.317936 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:17.317888 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hvbjq" Apr 17 16:31:17.318505 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:17.318487 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hvbjq" Apr 17 16:31:17.675617 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:17.675404 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" event={"ID":"f0164bdb-ef92-4743-91fc-03f010abe474","Type":"ContainerStarted","Data":"2f99b3f688777f90175ff3ce0bd7aae59893448562b47b76700236467bcfb088"} Apr 17 16:31:17.705686 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:17.705598 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" podStartSLOduration=8.412863275 podStartE2EDuration="25.705577639s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.482722461 +0000 UTC m=+3.380483894" lastFinishedPulling="2026-04-17 16:31:12.775436822 +0000 UTC m=+20.673198258" observedRunningTime="2026-04-17 16:31:17.704675897 +0000 UTC m=+25.602437392" watchObservedRunningTime="2026-04-17 16:31:17.705577639 +0000 UTC m=+25.603339096" Apr 17 16:31:18.143256 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.143217 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gpb6x"] Apr 17 16:31:18.147947 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.147921 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.148101 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:18.148004 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gpb6x" podUID="b94229cb-1bfa-4a97-b64a-bde473f3d9e0" Apr 17 16:31:18.254039 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.254006 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-dbus\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.254039 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.254042 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-kubelet-config\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.254292 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.254065 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.355481 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.355438 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-dbus\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.355481 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.355485 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-kubelet-config\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.355708 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.355512 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.355708 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.355618 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-kubelet-config\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.355708 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:18.355644 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:18.355842 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:18.355712 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret podName:b94229cb-1bfa-4a97-b64a-bde473f3d9e0 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:18.855692195 +0000 UTC m=+26.753453660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret") pod "global-pull-secret-syncer-gpb6x" (UID: "b94229cb-1bfa-4a97-b64a-bde473f3d9e0") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:18.355842 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.355725 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-dbus\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.518431 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.518342 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:31:18.518431 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.518383 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:18.518651 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:18.518476 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2" Apr 17 16:31:18.518651 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:18.518599 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf" Apr 17 16:31:18.677486 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.677441 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:31:18.677486 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.677480 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:31:18.677486 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.677494 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:31:18.693491 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.693462 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:31:18.693688 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.693672 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:31:18.860136 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:18.859549 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " 
pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:18.860136 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:18.859695 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:18.860136 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:18.859754 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret podName:b94229cb-1bfa-4a97-b64a-bde473f3d9e0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:19.859735161 +0000 UTC m=+27.757496607 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret") pod "global-pull-secret-syncer-gpb6x" (UID: "b94229cb-1bfa-4a97-b64a-bde473f3d9e0") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:19.518084 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:19.518044 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:19.518542 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:19.518175 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gpb6x" podUID="b94229cb-1bfa-4a97-b64a-bde473f3d9e0" Apr 17 16:31:19.764706 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:19.764669 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gpb6x"] Apr 17 16:31:19.764866 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:19.764785 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:19.764920 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:19.764894 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gpb6x" podUID="b94229cb-1bfa-4a97-b64a-bde473f3d9e0" Apr 17 16:31:19.768438 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:19.768367 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wdt9k"] Apr 17 16:31:19.768573 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:19.768487 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:31:19.768633 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:19.768585 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2" Apr 17 16:31:19.768961 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:19.768939 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rhztb"] Apr 17 16:31:19.769056 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:19.769042 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:19.769170 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:19.769154 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf" Apr 17 16:31:19.867950 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:19.867914 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:19.868109 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:19.868031 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:19.868154 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:19.868110 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret podName:b94229cb-1bfa-4a97-b64a-bde473f3d9e0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:21.868091047 +0000 UTC m=+29.765852496 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret") pod "global-pull-secret-syncer-gpb6x" (UID: "b94229cb-1bfa-4a97-b64a-bde473f3d9e0") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:20.681481 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:20.681445 2561 generic.go:358] "Generic (PLEG): container finished" podID="8fbceac7-0307-49d3-8986-e1b49a4b6760" containerID="4a110fadc3e0381a353c7efd5320809859d86374b1e66d6b150da6324afed76f" exitCode=0 Apr 17 16:31:20.681923 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:20.681507 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-642q4" event={"ID":"8fbceac7-0307-49d3-8986-e1b49a4b6760","Type":"ContainerDied","Data":"4a110fadc3e0381a353c7efd5320809859d86374b1e66d6b150da6324afed76f"} Apr 17 16:31:21.518994 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:21.518792 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:21.519171 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:21.518862 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:21.519171 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:21.519125 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gpb6x" podUID="b94229cb-1bfa-4a97-b64a-bde473f3d9e0" Apr 17 16:31:21.519299 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:21.519178 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf" Apr 17 16:31:21.519299 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:21.518893 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:31:21.519299 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:21.519285 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2" Apr 17 16:31:21.884278 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:21.884242 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:21.884667 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:21.884404 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:21.884667 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:21.884471 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret podName:b94229cb-1bfa-4a97-b64a-bde473f3d9e0 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:25.884456078 +0000 UTC m=+33.782217517 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret") pod "global-pull-secret-syncer-gpb6x" (UID: "b94229cb-1bfa-4a97-b64a-bde473f3d9e0") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:31:22.687038 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:22.687008 2561 generic.go:358] "Generic (PLEG): container finished" podID="8fbceac7-0307-49d3-8986-e1b49a4b6760" containerID="c3dd9698d96b4a749701359c58ed248551bbba7aef6fc91d69ca82c67bfe4f5a" exitCode=0
Apr 17 16:31:22.687233 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:22.687085 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-642q4" event={"ID":"8fbceac7-0307-49d3-8986-e1b49a4b6760","Type":"ContainerDied","Data":"c3dd9698d96b4a749701359c58ed248551bbba7aef6fc91d69ca82c67bfe4f5a"}
Apr 17 16:31:23.517927 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:23.517892 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gpb6x"
Apr 17 16:31:23.517927 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:23.517923 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:23.518553 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:23.518001 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gpb6x" podUID="b94229cb-1bfa-4a97-b64a-bde473f3d9e0"
Apr 17 16:31:23.518553 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:23.518043 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:23.518553 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:23.518153 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:31:23.518553 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:23.518212 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-rhztb" podUID="126fa63b-6174-4d95-bf2c-01daa7a91ccf"
Apr 17 16:31:24.693306 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:24.693271 2561 generic.go:358] "Generic (PLEG): container finished" podID="8fbceac7-0307-49d3-8986-e1b49a4b6760" containerID="f5873f73a939a120ab37f3ce876be4c5a5a9cc2404a9d3d6f5a6369cbc96b21d" exitCode=0
Apr 17 16:31:24.693659 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:24.693334 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-642q4" event={"ID":"8fbceac7-0307-49d3-8986-e1b49a4b6760","Type":"ContainerDied","Data":"f5873f73a939a120ab37f3ce876be4c5a5a9cc2404a9d3d6f5a6369cbc96b21d"}
Apr 17 16:31:25.456472 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.456443 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-35.ec2.internal" event="NodeReady"
Apr 17 16:31:25.456637 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.456597 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 16:31:25.493726 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.493687 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"]
Apr 17 16:31:25.509015 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.508972 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c58cd94cc-lznkk"]
Apr 17 16:31:25.509379 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.509350 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.512386 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.512303 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 16:31:25.512531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.512452 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 17 16:31:25.512876 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.512712 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 16:31:25.512876 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.512770 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 17 16:31:25.513157 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.513135 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 16:31:25.513238 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.513156 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 17 16:31:25.513238 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.513167 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 16:31:25.530260 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.530231 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"]
Apr 17 16:31:25.530420 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.530369 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:31:25.530420 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.530379 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gpb6x"
Apr 17 16:31:25.530420 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.530387 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.530591 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.530571 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb"
Apr 17 16:31:25.534541 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.534507 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 16:31:25.534879 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.534854 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 16:31:25.535096 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.535058 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 16:31:25.535426 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.535209 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 16:31:25.535426 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.535223 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w6hp7\""
Apr 17 16:31:25.535601 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.535438 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 16:31:25.535601 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.535504 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 16:31:25.535601 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.535535 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qkztw\""
Apr 17 16:31:25.535830 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.535810 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 16:31:25.536204 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.536140 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tvdp2\""
Apr 17 16:31:25.551918 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.551892 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 16:31:25.552227 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.552209 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4"]
Apr 17 16:31:25.552383 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.552367 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"
Apr 17 16:31:25.554777 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.554759 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 16:31:25.572913 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.572888 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hcv89"]
Apr 17 16:31:25.573045 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.573028 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4"
Apr 17 16:31:25.575367 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.575348 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 17 16:31:25.575465 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.575352 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-l8d7r\""
Apr 17 16:31:25.592820 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.592792 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hhlb4"]
Apr 17 16:31:25.592954 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.592848 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:31:25.595403 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.595375 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qsxk4\""
Apr 17 16:31:25.595518 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.595375 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 16:31:25.595518 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.595439 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 16:31:25.595518 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.595451 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 16:31:25.607384 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.607365 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"]
Apr 17 16:31:25.607489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.607406 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4"]
Apr 17 16:31:25.607489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.607421 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"]
Apr 17 16:31:25.607489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.607435 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c58cd94cc-lznkk"]
Apr 17 16:31:25.607489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.607447 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hcv89"]
Apr 17 16:31:25.607489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.607456 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hhlb4"]
Apr 17 16:31:25.607652 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.607535 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:31:25.611271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.611247 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 16:31:25.611271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.611247 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 16:31:25.611436 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.611364 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-78gjs\""
Apr 17 16:31:25.613258 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613239 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-installation-pull-secrets\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.613358 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613270 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.613358 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613292 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66mq\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-kube-api-access-b66mq\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.613358 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613314 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-image-registry-private-configuration\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.613358 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613340 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-ca\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.613562 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613410 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkl48\" (UniqueName: \"kubernetes.io/projected/3ef41909-613b-41f9-9401-d08531c9d28c-kube-api-access-jkl48\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.613562 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613451 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.613562 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613484 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/99bc45f2-d916-4072-897e-222a1077ef80-ca-trust-extracted\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.613562 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613511 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-bound-sa-token\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.613562 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613536 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3ef41909-613b-41f9-9401-d08531c9d28c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.613816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613566 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.613816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613631 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-registry-certificates\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.613816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613648 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-trusted-ca\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.613816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.613661 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-hub\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.714828 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.714798 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.714858 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkl48\" (UniqueName: \"kubernetes.io/projected/3ef41909-613b-41f9-9401-d08531c9d28c-kube-api-access-jkl48\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.714940 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl465\" (UniqueName: \"kubernetes.io/projected/c93f7728-334c-48f1-9627-2bec5d533ccb-kube-api-access-pl465\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.714989 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715021 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/99bc45f2-d916-4072-897e-222a1077ef80-ca-trust-extracted\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715045 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-bound-sa-token\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715089 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3ef41909-613b-41f9-9401-d08531c9d28c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715116 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:25.715133 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:25.715154 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c58cd94cc-lznkk: secret "image-registry-tls" not found
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715175 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c93f7728-334c-48f1-9627-2bec5d533ccb-tmp\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:25.715227 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls podName:99bc45f2-d916-4072-897e-222a1077ef80 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.215206464 +0000 UTC m=+34.112967910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls") pod "image-registry-6c58cd94cc-lznkk" (UID: "99bc45f2-d916-4072-897e-222a1077ef80") : secret "image-registry-tls" not found
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715300 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-registry-certificates\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715336 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-trusted-ca\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715363 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-hub\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715397 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsq6l\" (UniqueName: \"kubernetes.io/projected/947405ad-c2f6-4581-b056-296308a2cc2f-kube-api-access-qsq6l\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:31:25.715450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715425 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85addf4e-0ee6-43f5-a06c-f450323824f1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764867bc4f-btcv4\" (UID: \"85addf4e-0ee6-43f5-a06c-f450323824f1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715453 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsclv\" (UniqueName: \"kubernetes.io/projected/85addf4e-0ee6-43f5-a06c-f450323824f1-kube-api-access-dsclv\") pod \"managed-serviceaccount-addon-agent-764867bc4f-btcv4\" (UID: \"85addf4e-0ee6-43f5-a06c-f450323824f1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715487 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-installation-pull-secrets\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715514 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715554 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715580 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be864e9c-2445-4ad3-8453-a28d5bd5fda2-config-volume\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715612 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b66mq\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-kube-api-access-b66mq\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715639 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c93f7728-334c-48f1-9627-2bec5d533ccb-klusterlet-config\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715673 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-image-registry-private-configuration\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715702 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-ca\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715739 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be864e9c-2445-4ad3-8453-a28d5bd5fda2-tmp-dir\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715892 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-registry-certificates\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715903 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/3ef41909-613b-41f9-9401-d08531c9d28c-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.716240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.715964 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wlms\" (UniqueName: \"kubernetes.io/projected/be864e9c-2445-4ad3-8453-a28d5bd5fda2-kube-api-access-2wlms\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:31:25.716825 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.716803 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-trusted-ca\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.720713 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.720687 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-ca\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.720842 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.720687 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-image-registry-private-configuration\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.721067 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.721043 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-installation-pull-secrets\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.723817 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.723793 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-bound-sa-token\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.724173 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.724145 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/99bc45f2-d916-4072-897e-222a1077ef80-ca-trust-extracted\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.724267 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.724244 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66mq\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-kube-api-access-b66mq\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:31:25.730267 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.730245 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-hub\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.730383 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.730246 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.738463 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.738438 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3ef41909-613b-41f9-9401-d08531c9d28c-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.740560 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.740537 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkl48\" (UniqueName: \"kubernetes.io/projected/3ef41909-613b-41f9-9401-d08531c9d28c-kube-api-access-jkl48\") pod \"cluster-proxy-proxy-agent-74b858c94b-jjklq\" (UID: \"3ef41909-613b-41f9-9401-d08531c9d28c\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"
Apr 17 16:31:25.817170 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817129 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsq6l\" (UniqueName: \"kubernetes.io/projected/947405ad-c2f6-4581-b056-296308a2cc2f-kube-api-access-qsq6l\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:31:25.817170 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817169 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85addf4e-0ee6-43f5-a06c-f450323824f1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764867bc4f-btcv4\" (UID: \"85addf4e-0ee6-43f5-a06c-f450323824f1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4"
Apr 17 16:31:25.817407 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817188 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsclv\" (UniqueName: \"kubernetes.io/projected/85addf4e-0ee6-43f5-a06c-f450323824f1-kube-api-access-dsclv\") pod \"managed-serviceaccount-addon-agent-764867bc4f-btcv4\" (UID: \"85addf4e-0ee6-43f5-a06c-f450323824f1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4"
Apr 17 16:31:25.817407 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817211 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:31:25.817407 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817248 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be864e9c-2445-4ad3-8453-a28d5bd5fda2-config-volume\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:31:25.817407 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817279 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c93f7728-334c-48f1-9627-2bec5d533ccb-klusterlet-config\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:25.817407 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817306 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be864e9c-2445-4ad3-8453-a28d5bd5fda2-tmp-dir\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:25.817407 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:25.817360 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:25.817696 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817413 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wlms\" (UniqueName: \"kubernetes.io/projected/be864e9c-2445-4ad3-8453-a28d5bd5fda2-kube-api-access-2wlms\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:25.817696 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:25.817443 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert podName:947405ad-c2f6-4581-b056-296308a2cc2f nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.317419974 +0000 UTC m=+34.215181411 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert") pod "ingress-canary-hcv89" (UID: "947405ad-c2f6-4581-b056-296308a2cc2f") : secret "canary-serving-cert" not found Apr 17 16:31:25.817696 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817486 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:25.817696 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817539 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl465\" (UniqueName: \"kubernetes.io/projected/c93f7728-334c-48f1-9627-2bec5d533ccb-kube-api-access-pl465\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:25.817696 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817613 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c93f7728-334c-48f1-9627-2bec5d533ccb-tmp\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:25.817696 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817643 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/be864e9c-2445-4ad3-8453-a28d5bd5fda2-tmp-dir\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:25.817696 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:25.817660 2561 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:25.818031 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:25.817726 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls podName:be864e9c-2445-4ad3-8453-a28d5bd5fda2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:26.317713391 +0000 UTC m=+34.215474828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls") pod "dns-default-hhlb4" (UID: "be864e9c-2445-4ad3-8453-a28d5bd5fda2") : secret "dns-default-metrics-tls" not found Apr 17 16:31:25.818031 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817898 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be864e9c-2445-4ad3-8453-a28d5bd5fda2-config-volume\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:25.818031 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.817984 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c93f7728-334c-48f1-9627-2bec5d533ccb-tmp\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:25.819753 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.819734 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/85addf4e-0ee6-43f5-a06c-f450323824f1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-764867bc4f-btcv4\" (UID: \"85addf4e-0ee6-43f5-a06c-f450323824f1\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4" Apr 17 16:31:25.819817 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.819759 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c93f7728-334c-48f1-9627-2bec5d533ccb-klusterlet-config\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:25.828416 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.828382 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsclv\" (UniqueName: \"kubernetes.io/projected/85addf4e-0ee6-43f5-a06c-f450323824f1-kube-api-access-dsclv\") pod \"managed-serviceaccount-addon-agent-764867bc4f-btcv4\" (UID: \"85addf4e-0ee6-43f5-a06c-f450323824f1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4" Apr 17 16:31:25.828522 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.828421 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wlms\" (UniqueName: \"kubernetes.io/projected/be864e9c-2445-4ad3-8453-a28d5bd5fda2-kube-api-access-2wlms\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:25.828848 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.828831 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsq6l\" (UniqueName: \"kubernetes.io/projected/947405ad-c2f6-4581-b056-296308a2cc2f-kube-api-access-qsq6l\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89" Apr 17 16:31:25.828928 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.828913 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-pl465\" (UniqueName: \"kubernetes.io/projected/c93f7728-334c-48f1-9627-2bec5d533ccb-kube-api-access-pl465\") pod \"klusterlet-addon-workmgr-64bc78d648-wrd2v\" (UID: \"c93f7728-334c-48f1-9627-2bec5d533ccb\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:25.832840 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.832782 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" Apr 17 16:31:25.875761 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.874345 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:25.883291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.883265 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4" Apr 17 16:31:25.918624 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.918582 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:25.921374 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:25.921346 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b94229cb-1bfa-4a97-b64a-bde473f3d9e0-original-pull-secret\") pod \"global-pull-secret-syncer-gpb6x\" (UID: \"b94229cb-1bfa-4a97-b64a-bde473f3d9e0\") " pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:26.044590 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.044560 2561 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"] Apr 17 16:31:26.047628 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.047603 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq"] Apr 17 16:31:26.049320 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:31:26.049266 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc93f7728_334c_48f1_9627_2bec5d533ccb.slice/crio-08ea0a64a864a843446d0a126df9d12995253f94b770df7606c20e813ec427a8 WatchSource:0}: Error finding container 08ea0a64a864a843446d0a126df9d12995253f94b770df7606c20e813ec427a8: Status 404 returned error can't find the container with id 08ea0a64a864a843446d0a126df9d12995253f94b770df7606c20e813ec427a8 Apr 17 16:31:26.050841 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:31:26.050817 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ef41909_613b_41f9_9401_d08531c9d28c.slice/crio-489c09a9eb717df6835d87a1fe3316e75f01f092accc36c4ff874f6ee5618487 WatchSource:0}: Error finding container 489c09a9eb717df6835d87a1fe3316e75f01f092accc36c4ff874f6ee5618487: Status 404 returned error can't find the container with id 489c09a9eb717df6835d87a1fe3316e75f01f092accc36c4ff874f6ee5618487 Apr 17 16:31:26.072321 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.072272 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4"] Apr 17 16:31:26.075290 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:31:26.075249 2561 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85addf4e_0ee6_43f5_a06c_f450323824f1.slice/crio-85997a34d3da6566893fe8d97ca6c5d992952f9b142b498a7a7de940ad7dcf89 WatchSource:0}: Error finding container 85997a34d3da6566893fe8d97ca6c5d992952f9b142b498a7a7de940ad7dcf89: Status 404 returned error can't find the container with id 85997a34d3da6566893fe8d97ca6c5d992952f9b142b498a7a7de940ad7dcf89 Apr 17 16:31:26.152826 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.152697 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gpb6x" Apr 17 16:31:26.221412 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.221313 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:31:26.221412 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.221349 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:31:26.221608 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.221518 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:26.221608 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.221533 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:31:26.221608 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.221538 2561 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-6c58cd94cc-lznkk: secret "image-registry-tls" not found Apr 17 16:31:26.221608 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.221592 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:58.221572949 +0000 UTC m=+66.119334405 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : secret "metrics-daemon-secret" not found Apr 17 16:31:26.221608 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.221609 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls podName:99bc45f2-d916-4072-897e-222a1077ef80 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:27.22160025 +0000 UTC m=+35.119361686 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls") pod "image-registry-6c58cd94cc-lznkk" (UID: "99bc45f2-d916-4072-897e-222a1077ef80") : secret "image-registry-tls" not found Apr 17 16:31:26.279781 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.279748 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gpb6x"] Apr 17 16:31:26.292496 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:31:26.292464 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb94229cb_1bfa_4a97_b64a_bde473f3d9e0.slice/crio-d1c2ab3d03bb80e31229bbf692e79a96d183c0616e3e38880883b4c7e4dea9b9 WatchSource:0}: Error finding container d1c2ab3d03bb80e31229bbf692e79a96d183c0616e3e38880883b4c7e4dea9b9: Status 404 returned error can't find the container with id d1c2ab3d03bb80e31229bbf692e79a96d183c0616e3e38880883b4c7e4dea9b9 Apr 17 16:31:26.321874 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.321840 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:26.322061 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.321893 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89" Apr 17 16:31:26.322061 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.321945 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:26.322061 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.322053 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:26.322243 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.322158 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert podName:947405ad-c2f6-4581-b056-296308a2cc2f nodeName:}" failed. No retries permitted until 2026-04-17 16:31:27.32213618 +0000 UTC m=+35.219897630 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert") pod "ingress-canary-hcv89" (UID: "947405ad-c2f6-4581-b056-296308a2cc2f") : secret "canary-serving-cert" not found Apr 17 16:31:26.322243 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.322056 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:26.322243 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:26.322239 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls podName:be864e9c-2445-4ad3-8453-a28d5bd5fda2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:27.322218512 +0000 UTC m=+35.219979959 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls") pod "dns-default-hhlb4" (UID: "be864e9c-2445-4ad3-8453-a28d5bd5fda2") : secret "dns-default-metrics-tls" not found Apr 17 16:31:26.325759 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.325732 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwv9w\" (UniqueName: \"kubernetes.io/projected/126fa63b-6174-4d95-bf2c-01daa7a91ccf-kube-api-access-nwv9w\") pod \"network-check-target-rhztb\" (UID: \"126fa63b-6174-4d95-bf2c-01daa7a91ccf\") " pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:26.468541 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.468448 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:26.590744 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.590710 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-rhztb"] Apr 17 16:31:26.593771 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:31:26.593742 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod126fa63b_6174_4d95_bf2c_01daa7a91ccf.slice/crio-e5e83f153504272e512424a4f55079555348d398eb6cdc3665bbb7a42ca1a592 WatchSource:0}: Error finding container e5e83f153504272e512424a4f55079555348d398eb6cdc3665bbb7a42ca1a592: Status 404 returned error can't find the container with id e5e83f153504272e512424a4f55079555348d398eb6cdc3665bbb7a42ca1a592 Apr 17 16:31:26.699202 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.699157 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" 
event={"ID":"3ef41909-613b-41f9-9401-d08531c9d28c","Type":"ContainerStarted","Data":"489c09a9eb717df6835d87a1fe3316e75f01f092accc36c4ff874f6ee5618487"} Apr 17 16:31:26.700186 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.700154 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" event={"ID":"c93f7728-334c-48f1-9627-2bec5d533ccb","Type":"ContainerStarted","Data":"08ea0a64a864a843446d0a126df9d12995253f94b770df7606c20e813ec427a8"} Apr 17 16:31:26.701312 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.701291 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rhztb" event={"ID":"126fa63b-6174-4d95-bf2c-01daa7a91ccf","Type":"ContainerStarted","Data":"e5e83f153504272e512424a4f55079555348d398eb6cdc3665bbb7a42ca1a592"} Apr 17 16:31:26.702388 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.702361 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gpb6x" event={"ID":"b94229cb-1bfa-4a97-b64a-bde473f3d9e0","Type":"ContainerStarted","Data":"d1c2ab3d03bb80e31229bbf692e79a96d183c0616e3e38880883b4c7e4dea9b9"} Apr 17 16:31:26.703464 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:26.703442 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4" event={"ID":"85addf4e-0ee6-43f5-a06c-f450323824f1","Type":"ContainerStarted","Data":"85997a34d3da6566893fe8d97ca6c5d992952f9b142b498a7a7de940ad7dcf89"} Apr 17 16:31:27.230382 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:27.230342 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " 
pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:31:27.230857 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:27.230596 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:27.230857 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:27.230616 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c58cd94cc-lznkk: secret "image-registry-tls" not found Apr 17 16:31:27.230857 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:27.230681 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls podName:99bc45f2-d916-4072-897e-222a1077ef80 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:29.230658909 +0000 UTC m=+37.128420357 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls") pod "image-registry-6c58cd94cc-lznkk" (UID: "99bc45f2-d916-4072-897e-222a1077ef80") : secret "image-registry-tls" not found Apr 17 16:31:27.332454 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:27.331547 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89" Apr 17 16:31:27.332454 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:27.331657 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:27.332454 ip-10-0-130-35 
kubenswrapper[2561]: E0417 16:31:27.331782 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:27.332454 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:27.331881 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls podName:be864e9c-2445-4ad3-8453-a28d5bd5fda2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:29.331860907 +0000 UTC m=+37.229622354 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls") pod "dns-default-hhlb4" (UID: "be864e9c-2445-4ad3-8453-a28d5bd5fda2") : secret "dns-default-metrics-tls" not found Apr 17 16:31:27.332454 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:27.332362 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:27.332454 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:27.332414 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert podName:947405ad-c2f6-4581-b056-296308a2cc2f nodeName:}" failed. No retries permitted until 2026-04-17 16:31:29.332398718 +0000 UTC m=+37.230160168 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert") pod "ingress-canary-hcv89" (UID: "947405ad-c2f6-4581-b056-296308a2cc2f") : secret "canary-serving-cert" not found Apr 17 16:31:29.249524 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:29.249478 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:31:29.249977 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:29.249666 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:29.249977 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:29.249683 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c58cd94cc-lznkk: secret "image-registry-tls" not found Apr 17 16:31:29.249977 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:29.249745 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls podName:99bc45f2-d916-4072-897e-222a1077ef80 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:33.249725822 +0000 UTC m=+41.147487262 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls") pod "image-registry-6c58cd94cc-lznkk" (UID: "99bc45f2-d916-4072-897e-222a1077ef80") : secret "image-registry-tls" not found Apr 17 16:31:29.350914 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:29.350107 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89" Apr 17 16:31:29.350914 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:29.350187 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:29.350914 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:29.350336 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:29.350914 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:29.350398 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls podName:be864e9c-2445-4ad3-8453-a28d5bd5fda2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:33.350378679 +0000 UTC m=+41.248140129 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls") pod "dns-default-hhlb4" (UID: "be864e9c-2445-4ad3-8453-a28d5bd5fda2") : secret "dns-default-metrics-tls" not found Apr 17 16:31:29.350914 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:29.350790 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:29.350914 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:29.350832 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert podName:947405ad-c2f6-4581-b056-296308a2cc2f nodeName:}" failed. No retries permitted until 2026-04-17 16:31:33.350818021 +0000 UTC m=+41.248579457 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert") pod "ingress-canary-hcv89" (UID: "947405ad-c2f6-4581-b056-296308a2cc2f") : secret "canary-serving-cert" not found Apr 17 16:31:33.289990 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:33.289944 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:31:33.290572 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:33.290139 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:33.290572 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:33.290163 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c58cd94cc-lznkk: secret "image-registry-tls" not found Apr 17 
16:31:33.290572 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:33.290239 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls podName:99bc45f2-d916-4072-897e-222a1077ef80 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:41.290216287 +0000 UTC m=+49.187977731 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls") pod "image-registry-6c58cd94cc-lznkk" (UID: "99bc45f2-d916-4072-897e-222a1077ef80") : secret "image-registry-tls" not found Apr 17 16:31:33.390596 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:33.390555 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:33.390762 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:33.390646 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89" Apr 17 16:31:33.393966 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:33.391117 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:33.393966 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:33.391206 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls podName:be864e9c-2445-4ad3-8453-a28d5bd5fda2 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:41.391185414 +0000 UTC m=+49.288946864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls") pod "dns-default-hhlb4" (UID: "be864e9c-2445-4ad3-8453-a28d5bd5fda2") : secret "dns-default-metrics-tls" not found Apr 17 16:31:33.393966 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:33.391818 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:33.393966 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:33.391906 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert podName:947405ad-c2f6-4581-b056-296308a2cc2f nodeName:}" failed. No retries permitted until 2026-04-17 16:31:41.391875806 +0000 UTC m=+49.289637255 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert") pod "ingress-canary-hcv89" (UID: "947405ad-c2f6-4581-b056-296308a2cc2f") : secret "canary-serving-cert" not found Apr 17 16:31:39.740268 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.740217 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-rhztb" event={"ID":"126fa63b-6174-4d95-bf2c-01daa7a91ccf","Type":"ContainerStarted","Data":"69b18911c6bcce8fd7fc00dbf21a6c211a74160cd383f96ae446415bfd490a14"} Apr 17 16:31:39.740716 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.740462 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:31:39.741798 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.741768 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gpb6x" 
event={"ID":"b94229cb-1bfa-4a97-b64a-bde473f3d9e0","Type":"ContainerStarted","Data":"a082f6e5c9b391e5a2ef533d3c9e34a676a198e4f5419f68211368833878d2cf"} Apr 17 16:31:39.743120 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.743095 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4" event={"ID":"85addf4e-0ee6-43f5-a06c-f450323824f1","Type":"ContainerStarted","Data":"db29927fa15efc55e66162e7cec232b1363b1f1598cfe59dfdf10a2addfcd3a4"} Apr 17 16:31:39.745769 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.745743 2561 generic.go:358] "Generic (PLEG): container finished" podID="8fbceac7-0307-49d3-8986-e1b49a4b6760" containerID="eb651206395a5561f5da85ab64fa215338d2683e9ac1eaae6046b027ff9b0009" exitCode=0 Apr 17 16:31:39.745864 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.745825 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-642q4" event={"ID":"8fbceac7-0307-49d3-8986-e1b49a4b6760","Type":"ContainerDied","Data":"eb651206395a5561f5da85ab64fa215338d2683e9ac1eaae6046b027ff9b0009"} Apr 17 16:31:39.747191 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.747164 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" event={"ID":"3ef41909-613b-41f9-9401-d08531c9d28c","Type":"ContainerStarted","Data":"c43140e61ae101d17cfb6c885e7132af9f8ef5819bd07c0fc7af81917a8de127"} Apr 17 16:31:39.748431 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.748399 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" event={"ID":"c93f7728-334c-48f1-9627-2bec5d533ccb","Type":"ContainerStarted","Data":"a2bf0dd7d2c86f3821f2bd0be97b6af3d241c2f403dd4d7c4dc65dc5abbaf12c"} Apr 17 16:31:39.748682 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.748669 2561 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:39.750567 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.750547 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" Apr 17 16:31:39.755310 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.755268 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-rhztb" podStartSLOduration=35.504424446 podStartE2EDuration="47.755253733s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:31:26.595827309 +0000 UTC m=+34.493588742" lastFinishedPulling="2026-04-17 16:31:38.84665658 +0000 UTC m=+46.744418029" observedRunningTime="2026-04-17 16:31:39.755138877 +0000 UTC m=+47.652900333" watchObservedRunningTime="2026-04-17 16:31:39.755253733 +0000 UTC m=+47.653015185" Apr 17 16:31:39.769856 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.769813 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gpb6x" podStartSLOduration=9.213479132 podStartE2EDuration="21.7698007s" podCreationTimestamp="2026-04-17 16:31:18 +0000 UTC" firstStartedPulling="2026-04-17 16:31:26.300778723 +0000 UTC m=+34.198540172" lastFinishedPulling="2026-04-17 16:31:38.857100303 +0000 UTC m=+46.754861740" observedRunningTime="2026-04-17 16:31:39.769046575 +0000 UTC m=+47.666808036" watchObservedRunningTime="2026-04-17 16:31:39.7698007 +0000 UTC m=+47.667562154" Apr 17 16:31:39.820307 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.820247 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4" podStartSLOduration=5.051271474 podStartE2EDuration="17.820225998s" 
podCreationTimestamp="2026-04-17 16:31:22 +0000 UTC" firstStartedPulling="2026-04-17 16:31:26.07719037 +0000 UTC m=+33.974951809" lastFinishedPulling="2026-04-17 16:31:38.846144887 +0000 UTC m=+46.743906333" observedRunningTime="2026-04-17 16:31:39.803617711 +0000 UTC m=+47.701379167" watchObservedRunningTime="2026-04-17 16:31:39.820225998 +0000 UTC m=+47.717987479" Apr 17 16:31:39.821003 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:39.820959 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" podStartSLOduration=5.026476568 podStartE2EDuration="17.820949814s" podCreationTimestamp="2026-04-17 16:31:22 +0000 UTC" firstStartedPulling="2026-04-17 16:31:26.051541296 +0000 UTC m=+33.949302736" lastFinishedPulling="2026-04-17 16:31:38.846014545 +0000 UTC m=+46.743775982" observedRunningTime="2026-04-17 16:31:39.819779381 +0000 UTC m=+47.717540837" watchObservedRunningTime="2026-04-17 16:31:39.820949814 +0000 UTC m=+47.718711272" Apr 17 16:31:40.754251 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:40.754215 2561 generic.go:358] "Generic (PLEG): container finished" podID="8fbceac7-0307-49d3-8986-e1b49a4b6760" containerID="06c454809f23142c5a988fd3f9d7d6e6fb7ff5599164456e4e40d8874af056e1" exitCode=0 Apr 17 16:31:40.754709 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:40.754331 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-642q4" event={"ID":"8fbceac7-0307-49d3-8986-e1b49a4b6760","Type":"ContainerDied","Data":"06c454809f23142c5a988fd3f9d7d6e6fb7ff5599164456e4e40d8874af056e1"} Apr 17 16:31:41.361564 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:41.361526 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: 
\"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:31:41.361740 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:41.361642 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:41.361740 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:41.361654 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c58cd94cc-lznkk: secret "image-registry-tls" not found Apr 17 16:31:41.361740 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:41.361704 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls podName:99bc45f2-d916-4072-897e-222a1077ef80 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:57.361691325 +0000 UTC m=+65.259452759 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls") pod "image-registry-6c58cd94cc-lznkk" (UID: "99bc45f2-d916-4072-897e-222a1077ef80") : secret "image-registry-tls" not found Apr 17 16:31:41.462233 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:41.462196 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:41.462418 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:41.462274 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " 
pod="openshift-ingress-canary/ingress-canary-hcv89" Apr 17 16:31:41.462418 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:41.462352 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:41.462418 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:41.462415 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls podName:be864e9c-2445-4ad3-8453-a28d5bd5fda2 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:57.462400523 +0000 UTC m=+65.360161957 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls") pod "dns-default-hhlb4" (UID: "be864e9c-2445-4ad3-8453-a28d5bd5fda2") : secret "dns-default-metrics-tls" not found Apr 17 16:31:41.462537 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:41.462361 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:41.462537 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:41.462483 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert podName:947405ad-c2f6-4581-b056-296308a2cc2f nodeName:}" failed. No retries permitted until 2026-04-17 16:31:57.462472287 +0000 UTC m=+65.360233724 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert") pod "ingress-canary-hcv89" (UID: "947405ad-c2f6-4581-b056-296308a2cc2f") : secret "canary-serving-cert" not found Apr 17 16:31:41.761236 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:41.761155 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-642q4" event={"ID":"8fbceac7-0307-49d3-8986-e1b49a4b6760","Type":"ContainerStarted","Data":"0d45a991baa50dd761730d8f8439c2336f364a7c4fb1c396279c2f6ce898bdad"} Apr 17 16:31:41.762859 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:41.762828 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" event={"ID":"3ef41909-613b-41f9-9401-d08531c9d28c","Type":"ContainerStarted","Data":"569ea815012c05f314170eb65fec6f27aa619c71d9f724f2ba63f2ea09275d67"} Apr 17 16:31:41.762985 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:41.762865 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" event={"ID":"3ef41909-613b-41f9-9401-d08531c9d28c","Type":"ContainerStarted","Data":"522998d09127fcfb77f806b5b2dc10dd9d7acb090b87506dbf8c91bb96f230b2"} Apr 17 16:31:41.790657 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:41.790603 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-642q4" podStartSLOduration=6.431324029 podStartE2EDuration="49.790588484s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:30:55.487394705 +0000 UTC m=+3.385156150" lastFinishedPulling="2026-04-17 16:31:38.846659157 +0000 UTC m=+46.744420605" observedRunningTime="2026-04-17 16:31:41.790064334 +0000 UTC m=+49.687825791" watchObservedRunningTime="2026-04-17 16:31:41.790588484 +0000 UTC m=+49.688349938" Apr 17 
16:31:41.821116 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:41.821048 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" podStartSLOduration=4.984483334 podStartE2EDuration="19.821035787s" podCreationTimestamp="2026-04-17 16:31:22 +0000 UTC" firstStartedPulling="2026-04-17 16:31:26.053000478 +0000 UTC m=+33.950761916" lastFinishedPulling="2026-04-17 16:31:40.889552916 +0000 UTC m=+48.787314369" observedRunningTime="2026-04-17 16:31:41.819816436 +0000 UTC m=+49.717577891" watchObservedRunningTime="2026-04-17 16:31:41.821035787 +0000 UTC m=+49.718797242" Apr 17 16:31:50.700714 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:50.700670 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qkt5m" Apr 17 16:31:57.385545 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:57.385501 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:31:57.386007 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:57.385652 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:31:57.386007 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:57.385673 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c58cd94cc-lznkk: secret "image-registry-tls" not found Apr 17 16:31:57.386007 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:57.385727 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls 
podName:99bc45f2-d916-4072-897e-222a1077ef80 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:29.385712499 +0000 UTC m=+97.283473933 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls") pod "image-registry-6c58cd94cc-lznkk" (UID: "99bc45f2-d916-4072-897e-222a1077ef80") : secret "image-registry-tls" not found Apr 17 16:31:57.486443 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:57.486408 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89" Apr 17 16:31:57.486597 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:57.486466 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:31:57.486597 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:57.486562 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:57.486597 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:57.486569 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:57.486716 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:57.486626 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert podName:947405ad-c2f6-4581-b056-296308a2cc2f nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:29.486612184 +0000 UTC m=+97.384373621 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert") pod "ingress-canary-hcv89" (UID: "947405ad-c2f6-4581-b056-296308a2cc2f") : secret "canary-serving-cert" not found Apr 17 16:31:57.486716 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:57.486640 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls podName:be864e9c-2445-4ad3-8453-a28d5bd5fda2 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:29.486634936 +0000 UTC m=+97.384396369 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls") pod "dns-default-hhlb4" (UID: "be864e9c-2445-4ad3-8453-a28d5bd5fda2") : secret "dns-default-metrics-tls" not found Apr 17 16:31:58.292537 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:31:58.292490 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:31:58.292724 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:58.292630 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:31:58.292724 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:31:58.292688 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:33:02.292674771 +0000 UTC m=+130.190436204 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : secret "metrics-daemon-secret" not found Apr 17 16:32:10.756834 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:32:10.756701 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-rhztb" Apr 17 16:32:29.435828 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:32:29.435732 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:32:29.436272 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:32:29.435891 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:32:29.436272 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:32:29.435913 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c58cd94cc-lznkk: secret "image-registry-tls" not found Apr 17 16:32:29.436272 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:32:29.435968 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls podName:99bc45f2-d916-4072-897e-222a1077ef80 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:33.435953738 +0000 UTC m=+161.333715171 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls") pod "image-registry-6c58cd94cc-lznkk" (UID: "99bc45f2-d916-4072-897e-222a1077ef80") : secret "image-registry-tls" not found Apr 17 16:32:29.537063 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:32:29.537022 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89" Apr 17 16:32:29.537283 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:32:29.537101 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4" Apr 17 16:32:29.537283 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:32:29.537222 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:29.537283 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:32:29.537276 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls podName:be864e9c-2445-4ad3-8453-a28d5bd5fda2 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:33.537262458 +0000 UTC m=+161.435023890 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls") pod "dns-default-hhlb4" (UID: "be864e9c-2445-4ad3-8453-a28d5bd5fda2") : secret "dns-default-metrics-tls" not found Apr 17 16:32:29.537446 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:32:29.537218 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:29.537446 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:32:29.537383 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert podName:947405ad-c2f6-4581-b056-296308a2cc2f nodeName:}" failed. No retries permitted until 2026-04-17 16:33:33.537361754 +0000 UTC m=+161.435123187 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert") pod "ingress-canary-hcv89" (UID: "947405ad-c2f6-4581-b056-296308a2cc2f") : secret "canary-serving-cert" not found Apr 17 16:33:02.371259 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:02.371205 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:33:02.371755 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:33:02.371363 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:33:02.371755 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:33:02.371445 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs podName:8a91f76e-d64e-4d72-92ff-c27c12f465d2 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:35:04.371426503 +0000 UTC m=+252.269187953 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs") pod "network-metrics-daemon-wdt9k" (UID: "8a91f76e-d64e-4d72-92ff-c27c12f465d2") : secret "metrics-daemon-secret" not found Apr 17 16:33:08.291857 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:08.291825 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lpfgv_65e080a5-8430-43f5-b120-a3fff8102219/dns-node-resolver/0.log" Apr 17 16:33:09.092420 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:09.092392 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-chsm2_05559253-f52c-49e6-a8e0-1751350669ac/node-ca/0.log" Apr 17 16:33:26.276690 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.276661 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fckrp"] Apr 17 16:33:26.279708 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.279691 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.282297 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.282276 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 16:33:26.283377 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.283355 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 16:33:26.283465 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.283428 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-shqjl\""
Apr 17 16:33:26.283465 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.283428 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 16:33:26.283465 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.283457 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 16:33:26.300780 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.300757 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fckrp"]
Apr 17 16:33:26.355001 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.354969 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.355181 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.355021 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-data-volume\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.355228 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.355185 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-crio-socket\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.355228 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.355217 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.355288 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.355246 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt69z\" (UniqueName: \"kubernetes.io/projected/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-kube-api-access-rt69z\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.455986 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.455951 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.456197 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.456000 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-data-volume\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.456197 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.456041 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-crio-socket\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.456197 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.456060 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.456197 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.456117 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rt69z\" (UniqueName: \"kubernetes.io/projected/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-kube-api-access-rt69z\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.456197 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.456176 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-crio-socket\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.456474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.456453 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-data-volume\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.456531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.456512 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.459034 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.459010 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.465877 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.465846 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt69z\" (UniqueName: \"kubernetes.io/projected/74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37-kube-api-access-rt69z\") pod \"insights-runtime-extractor-fckrp\" (UID: \"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37\") " pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.588167 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.588137 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fckrp"
Apr 17 16:33:26.720211 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:26.720186 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fckrp"]
Apr 17 16:33:26.722548 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:33:26.722524 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a3f7f5_a863_48f7_bd64_3d1d9ca3fd37.slice/crio-2adcfa5d8363fbe493bc3d00c911e331110f845053d51fa9cb659ca13b7c60b2 WatchSource:0}: Error finding container 2adcfa5d8363fbe493bc3d00c911e331110f845053d51fa9cb659ca13b7c60b2: Status 404 returned error can't find the container with id 2adcfa5d8363fbe493bc3d00c911e331110f845053d51fa9cb659ca13b7c60b2
Apr 17 16:33:27.008128 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:27.008023 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fckrp" event={"ID":"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37","Type":"ContainerStarted","Data":"558dec490c1d8ab537b45b18c52456ec6c444d914a1c9c609e213ed2284bd076"}
Apr 17 16:33:27.008128 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:27.008057 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fckrp" event={"ID":"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37","Type":"ContainerStarted","Data":"2adcfa5d8363fbe493bc3d00c911e331110f845053d51fa9cb659ca13b7c60b2"}
Apr 17 16:33:28.012226 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:28.012188 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fckrp" event={"ID":"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37","Type":"ContainerStarted","Data":"b26f26718c619f8e09a432904d483d82d9eef0dd6b764266ccb6030099ce0f56"}
Apr 17 16:33:28.544578 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:33:28.544536 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wdt9k" podUID="8a91f76e-d64e-4d72-92ff-c27c12f465d2"
Apr 17 16:33:28.560719 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:33:28.560679 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" podUID="99bc45f2-d916-4072-897e-222a1077ef80"
Apr 17 16:33:28.610101 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:33:28.610042 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hcv89" podUID="947405ad-c2f6-4581-b056-296308a2cc2f"
Apr 17 16:33:28.618365 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:33:28.618331 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hhlb4" podUID="be864e9c-2445-4ad3-8453-a28d5bd5fda2"
Apr 17 16:33:29.016653 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:29.016619 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:33:29.016653 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:29.016630 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:33:29.017045 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:29.016622 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fckrp" event={"ID":"74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37","Type":"ContainerStarted","Data":"da1fdeac61cc6339749c998fbd44cf27d654ee160c4f55162122aa2518788ef4"}
Apr 17 16:33:29.017045 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:29.016725 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:33:29.034559 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:29.034512 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fckrp" podStartSLOduration=0.902564944 podStartE2EDuration="3.034498763s" podCreationTimestamp="2026-04-17 16:33:26 +0000 UTC" firstStartedPulling="2026-04-17 16:33:26.772912779 +0000 UTC m=+154.670674212" lastFinishedPulling="2026-04-17 16:33:28.904846579 +0000 UTC m=+156.802608031" observedRunningTime="2026-04-17 16:33:29.0337223 +0000 UTC m=+156.931483757" watchObservedRunningTime="2026-04-17 16:33:29.034498763 +0000 UTC m=+156.932260218"
Apr 17 16:33:33.512751 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.512674 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:33:33.515048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.515026 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"image-registry-6c58cd94cc-lznkk\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:33:33.520556 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.520531 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w6hp7\""
Apr 17 16:33:33.528756 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.528739 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:33:33.613775 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.613734 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:33:33.613929 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.613902 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:33:33.616433 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.616406 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/be864e9c-2445-4ad3-8453-a28d5bd5fda2-metrics-tls\") pod \"dns-default-hhlb4\" (UID: \"be864e9c-2445-4ad3-8453-a28d5bd5fda2\") " pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:33:33.617449 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.617425 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/947405ad-c2f6-4581-b056-296308a2cc2f-cert\") pod \"ingress-canary-hcv89\" (UID: \"947405ad-c2f6-4581-b056-296308a2cc2f\") " pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:33:33.647223 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.647194 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c58cd94cc-lznkk"]
Apr 17 16:33:33.651209 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:33:33.651184 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99bc45f2_d916_4072_897e_222a1077ef80.slice/crio-d7c6148ca226c5e299d15bf525b6217430e7cc43199cd6ff654d7f81f6a95b00 WatchSource:0}: Error finding container d7c6148ca226c5e299d15bf525b6217430e7cc43199cd6ff654d7f81f6a95b00: Status 404 returned error can't find the container with id d7c6148ca226c5e299d15bf525b6217430e7cc43199cd6ff654d7f81f6a95b00
Apr 17 16:33:33.820168 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.820133 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qsxk4\""
Apr 17 16:33:33.820168 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.820152 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-78gjs\""
Apr 17 16:33:33.828512 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.828489 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:33:33.828637 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.828570 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hcv89"
Apr 17 16:33:33.954796 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.954738 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hcv89"]
Apr 17 16:33:33.959352 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:33:33.959319 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod947405ad_c2f6_4581_b056_296308a2cc2f.slice/crio-ddfcfc4ee4e9377d1eace2497921ad78e48a77b9d06038e30d783c86dc0229e7 WatchSource:0}: Error finding container ddfcfc4ee4e9377d1eace2497921ad78e48a77b9d06038e30d783c86dc0229e7: Status 404 returned error can't find the container with id ddfcfc4ee4e9377d1eace2497921ad78e48a77b9d06038e30d783c86dc0229e7
Apr 17 16:33:33.972126 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:33.972096 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hhlb4"]
Apr 17 16:33:33.974901 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:33:33.974869 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe864e9c_2445_4ad3_8453_a28d5bd5fda2.slice/crio-ff5a7df7d6f71b2a3794e45ab027da0d89fa3f53f7a37c1583855b3a84e556ea WatchSource:0}: Error finding container ff5a7df7d6f71b2a3794e45ab027da0d89fa3f53f7a37c1583855b3a84e556ea: Status 404 returned error can't find the container with id ff5a7df7d6f71b2a3794e45ab027da0d89fa3f53f7a37c1583855b3a84e556ea
Apr 17 16:33:34.033535 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:34.033485 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hhlb4" event={"ID":"be864e9c-2445-4ad3-8453-a28d5bd5fda2","Type":"ContainerStarted","Data":"ff5a7df7d6f71b2a3794e45ab027da0d89fa3f53f7a37c1583855b3a84e556ea"}
Apr 17 16:33:34.034824 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:34.034794 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" event={"ID":"99bc45f2-d916-4072-897e-222a1077ef80","Type":"ContainerStarted","Data":"9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d"}
Apr 17 16:33:34.034964 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:34.034829 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" event={"ID":"99bc45f2-d916-4072-897e-222a1077ef80","Type":"ContainerStarted","Data":"d7c6148ca226c5e299d15bf525b6217430e7cc43199cd6ff654d7f81f6a95b00"}
Apr 17 16:33:34.035035 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:34.034973 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk"
Apr 17 16:33:34.035837 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:34.035818 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hcv89" event={"ID":"947405ad-c2f6-4581-b056-296308a2cc2f","Type":"ContainerStarted","Data":"ddfcfc4ee4e9377d1eace2497921ad78e48a77b9d06038e30d783c86dc0229e7"}
Apr 17 16:33:34.058532 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:34.058475 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" podStartSLOduration=141.058461335 podStartE2EDuration="2m21.058461335s" podCreationTimestamp="2026-04-17 16:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:34.057350928 +0000 UTC m=+161.955112384" watchObservedRunningTime="2026-04-17 16:33:34.058461335 +0000 UTC m=+161.956222790"
Apr 17 16:33:36.043146 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:36.043116 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hcv89" event={"ID":"947405ad-c2f6-4581-b056-296308a2cc2f","Type":"ContainerStarted","Data":"a093cd128b04a251a8966a3a2c9b5eede549a2f976b1c0572d6c6a9547d98e89"}
Apr 17 16:33:36.057435 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:36.057386 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hcv89" podStartSLOduration=129.077527399 podStartE2EDuration="2m11.05736602s" podCreationTimestamp="2026-04-17 16:31:25 +0000 UTC" firstStartedPulling="2026-04-17 16:33:33.961221991 +0000 UTC m=+161.858983424" lastFinishedPulling="2026-04-17 16:33:35.941060612 +0000 UTC m=+163.838822045" observedRunningTime="2026-04-17 16:33:36.056997312 +0000 UTC m=+163.954758768" watchObservedRunningTime="2026-04-17 16:33:36.05736602 +0000 UTC m=+163.955127476"
Apr 17 16:33:37.047158 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:37.047112 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hhlb4" event={"ID":"be864e9c-2445-4ad3-8453-a28d5bd5fda2","Type":"ContainerStarted","Data":"6f3155b480c8a8922ddd0d47d975a6b36ce90ca499bf5ffe8b0e61f34057851c"}
Apr 17 16:33:37.047158 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:37.047162 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hhlb4" event={"ID":"be864e9c-2445-4ad3-8453-a28d5bd5fda2","Type":"ContainerStarted","Data":"015b5ac16fda59b807d66061a0a4e49a232820b8cfdef8878219d6da6653981b"}
Apr 17 16:33:37.065094 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:37.065026 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hhlb4" podStartSLOduration=130.104023448 podStartE2EDuration="2m12.065010697s" podCreationTimestamp="2026-04-17 16:31:25 +0000 UTC" firstStartedPulling="2026-04-17 16:33:33.976706609 +0000 UTC m=+161.874468042" lastFinishedPulling="2026-04-17 16:33:35.937693855 +0000 UTC m=+163.835455291" observedRunningTime="2026-04-17 16:33:37.06378658 +0000 UTC m=+164.961548035" watchObservedRunningTime="2026-04-17 16:33:37.065010697 +0000 UTC m=+164.962772152"
Apr 17 16:33:38.049696 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:38.049668 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:33:39.749569 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:39.749499 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" podUID="c93f7728-334c-48f1-9627-2bec5d533ccb" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused"
Apr 17 16:33:40.059814 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:40.059787 2561 generic.go:358] "Generic (PLEG): container finished" podID="c93f7728-334c-48f1-9627-2bec5d533ccb" containerID="a2bf0dd7d2c86f3821f2bd0be97b6af3d241c2f403dd4d7c4dc65dc5abbaf12c" exitCode=1
Apr 17 16:33:40.059993 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:40.059868 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" event={"ID":"c93f7728-334c-48f1-9627-2bec5d533ccb","Type":"ContainerDied","Data":"a2bf0dd7d2c86f3821f2bd0be97b6af3d241c2f403dd4d7c4dc65dc5abbaf12c"}
Apr 17 16:33:40.060279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:40.060264 2561 scope.go:117] "RemoveContainer" containerID="a2bf0dd7d2c86f3821f2bd0be97b6af3d241c2f403dd4d7c4dc65dc5abbaf12c"
Apr 17 16:33:40.060985 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:40.060958 2561 generic.go:358] "Generic (PLEG): container finished" podID="85addf4e-0ee6-43f5-a06c-f450323824f1" containerID="db29927fa15efc55e66162e7cec232b1363b1f1598cfe59dfdf10a2addfcd3a4" exitCode=255
Apr 17 16:33:40.061035 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:40.061007 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4" event={"ID":"85addf4e-0ee6-43f5-a06c-f450323824f1","Type":"ContainerDied","Data":"db29927fa15efc55e66162e7cec232b1363b1f1598cfe59dfdf10a2addfcd3a4"}
Apr 17 16:33:40.061258 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:40.061244 2561 scope.go:117] "RemoveContainer" containerID="db29927fa15efc55e66162e7cec232b1363b1f1598cfe59dfdf10a2addfcd3a4"
Apr 17 16:33:41.064973 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:41.064932 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-764867bc4f-btcv4" event={"ID":"85addf4e-0ee6-43f5-a06c-f450323824f1","Type":"ContainerStarted","Data":"3866c02917d638b1d0eef2ae94fd608c569ad354338f3a482790b0b4f3118c6a"}
Apr 17 16:33:41.066403 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:41.066360 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v" event={"ID":"c93f7728-334c-48f1-9627-2bec5d533ccb","Type":"ContainerStarted","Data":"c35e9158123506c7a363b53ecbd1f95e1327c107149f6d6e0d03c811d8105297"}
Apr 17 16:33:41.066641 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:41.066620 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"
Apr 17 16:33:41.067171 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:41.067149 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-64bc78d648-wrd2v"
Apr 17 16:33:43.518168 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:43.518115 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k"
Apr 17 16:33:48.058880 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.058849 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hhlb4"
Apr 17 16:33:48.664336 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.664206 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6n8jh"]
Apr 17 16:33:48.670599 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.670575 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.679886 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.679862 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 16:33:48.680635 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.680616 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 16:33:48.682613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.682590 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 16:33:48.682789 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.682769 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 16:33:48.682838 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.682803 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 16:33:48.690360 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.690331 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 16:33:48.699594 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.699564 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bxbxp\""
Apr 17 16:33:48.834747 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834716 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-sys\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.834747 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834750 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-metrics-client-ca\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.834947 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834768 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsdvq\" (UniqueName: \"kubernetes.io/projected/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-kube-api-access-bsdvq\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.834947 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834800 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-wtmp\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.834947 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834835 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-root\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.834947 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834855 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.834947 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834889 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-accelerators-collector-config\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.834947 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834914 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-tls\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.835167 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.834973 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-textfile\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936447 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936329 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-sys\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936447 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936383 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-metrics-client-ca\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936447 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936402 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsdvq\" (UniqueName: \"kubernetes.io/projected/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-kube-api-access-bsdvq\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936447 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936423 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-wtmp\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936457 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-sys\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936529 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-root\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936540 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-wtmp\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936563 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936581 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-root\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936610 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-accelerators-collector-config\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936638 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-tls\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.936776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936665 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-textfile\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.937101 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.936951 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-textfile\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.937101 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.937025 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-metrics-client-ca\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.937187 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.937174 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-accelerators-collector-config\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.938898 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.938873 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-tls\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.938987 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.938901 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.955023 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.954989 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsdvq\" (UniqueName: \"kubernetes.io/projected/5686acea-41ea-4c2e-a01a-b0faaf0e86e6-kube-api-access-bsdvq\") pod \"node-exporter-6n8jh\" (UID: \"5686acea-41ea-4c2e-a01a-b0faaf0e86e6\") " pod="openshift-monitoring/node-exporter-6n8jh"
Apr 17 16:33:48.979676 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:48.979638 2561 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-6n8jh" Apr 17 16:33:48.988610 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:33:48.988571 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5686acea_41ea_4c2e_a01a_b0faaf0e86e6.slice/crio-aeefcec1cb918835221712d8d32896633119c665e7a65299b10fa731da9abc91 WatchSource:0}: Error finding container aeefcec1cb918835221712d8d32896633119c665e7a65299b10fa731da9abc91: Status 404 returned error can't find the container with id aeefcec1cb918835221712d8d32896633119c665e7a65299b10fa731da9abc91 Apr 17 16:33:49.088622 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:49.088586 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6n8jh" event={"ID":"5686acea-41ea-4c2e-a01a-b0faaf0e86e6","Type":"ContainerStarted","Data":"aeefcec1cb918835221712d8d32896633119c665e7a65299b10fa731da9abc91"} Apr 17 16:33:50.092779 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:50.092740 2561 generic.go:358] "Generic (PLEG): container finished" podID="5686acea-41ea-4c2e-a01a-b0faaf0e86e6" containerID="3eca5167eb29c44386b2b3487c3faffa9fed3bd0c7bfe0006f0806d5703c435e" exitCode=0 Apr 17 16:33:50.093176 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:50.092806 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6n8jh" event={"ID":"5686acea-41ea-4c2e-a01a-b0faaf0e86e6","Type":"ContainerDied","Data":"3eca5167eb29c44386b2b3487c3faffa9fed3bd0c7bfe0006f0806d5703c435e"} Apr 17 16:33:51.096922 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:51.096889 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6n8jh" event={"ID":"5686acea-41ea-4c2e-a01a-b0faaf0e86e6","Type":"ContainerStarted","Data":"bb49b6442d0187c85c7cf1e30db2b7771d7003c7e89322d074ec2ff030688757"} Apr 17 16:33:51.096922 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:51.096926 2561 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6n8jh" event={"ID":"5686acea-41ea-4c2e-a01a-b0faaf0e86e6","Type":"ContainerStarted","Data":"270a1f560c86d5394ddf1266bafdee4dde776c9d1003ba33e8472d6e623f0468"} Apr 17 16:33:51.117850 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:51.117800 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6n8jh" podStartSLOduration=2.229328912 podStartE2EDuration="3.117786193s" podCreationTimestamp="2026-04-17 16:33:48 +0000 UTC" firstStartedPulling="2026-04-17 16:33:48.990590119 +0000 UTC m=+176.888351569" lastFinishedPulling="2026-04-17 16:33:49.879047402 +0000 UTC m=+177.776808850" observedRunningTime="2026-04-17 16:33:51.116140995 +0000 UTC m=+179.013902450" watchObservedRunningTime="2026-04-17 16:33:51.117786193 +0000 UTC m=+179.015547649" Apr 17 16:33:53.534691 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:53.534657 2561 patch_prober.go:28] interesting pod/image-registry-6c58cd94cc-lznkk container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 16:33:53.535055 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:53.534715 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" podUID="99bc45f2-d916-4072-897e-222a1077ef80" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:33:55.043169 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:55.043137 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:33:58.567374 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:33:58.567338 2561 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["openshift-image-registry/image-registry-6c58cd94cc-lznkk"] Apr 17 16:34:15.833920 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:15.833881 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" podUID="3ef41909-613b-41f9-9401-d08531c9d28c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:34:23.586287 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.586221 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" podUID="99bc45f2-d916-4072-897e-222a1077ef80" containerName="registry" containerID="cri-o://9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d" gracePeriod=30 Apr 17 16:34:23.827216 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.827185 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:34:23.899427 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899344 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") pod \"99bc45f2-d916-4072-897e-222a1077ef80\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " Apr 17 16:34:23.899427 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899390 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-bound-sa-token\") pod \"99bc45f2-d916-4072-897e-222a1077ef80\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " Apr 17 16:34:23.899427 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899413 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/99bc45f2-d916-4072-897e-222a1077ef80-ca-trust-extracted\") pod \"99bc45f2-d916-4072-897e-222a1077ef80\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " Apr 17 16:34:23.899738 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899440 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-image-registry-private-configuration\") pod \"99bc45f2-d916-4072-897e-222a1077ef80\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " Apr 17 16:34:23.899738 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899474 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-registry-certificates\") pod \"99bc45f2-d916-4072-897e-222a1077ef80\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " Apr 17 16:34:23.899738 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899491 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-trusted-ca\") pod \"99bc45f2-d916-4072-897e-222a1077ef80\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " Apr 17 16:34:23.899738 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899519 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b66mq\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-kube-api-access-b66mq\") pod \"99bc45f2-d916-4072-897e-222a1077ef80\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " Apr 17 16:34:23.899738 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899551 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-installation-pull-secrets\") pod \"99bc45f2-d916-4072-897e-222a1077ef80\" (UID: \"99bc45f2-d916-4072-897e-222a1077ef80\") " Apr 17 16:34:23.899991 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899953 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "99bc45f2-d916-4072-897e-222a1077ef80" (UID: "99bc45f2-d916-4072-897e-222a1077ef80"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:23.899991 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.899969 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "99bc45f2-d916-4072-897e-222a1077ef80" (UID: "99bc45f2-d916-4072-897e-222a1077ef80"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:23.901885 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.901851 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "99bc45f2-d916-4072-897e-222a1077ef80" (UID: "99bc45f2-d916-4072-897e-222a1077ef80"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:23.902014 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.901881 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "99bc45f2-d916-4072-897e-222a1077ef80" (UID: "99bc45f2-d916-4072-897e-222a1077ef80"). 
InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:23.902098 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.902047 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "99bc45f2-d916-4072-897e-222a1077ef80" (UID: "99bc45f2-d916-4072-897e-222a1077ef80"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:23.902163 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.902126 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "99bc45f2-d916-4072-897e-222a1077ef80" (UID: "99bc45f2-d916-4072-897e-222a1077ef80"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:23.902336 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.902315 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-kube-api-access-b66mq" (OuterVolumeSpecName: "kube-api-access-b66mq") pod "99bc45f2-d916-4072-897e-222a1077ef80" (UID: "99bc45f2-d916-4072-897e-222a1077ef80"). InnerVolumeSpecName "kube-api-access-b66mq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:23.910790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:23.910758 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99bc45f2-d916-4072-897e-222a1077ef80-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "99bc45f2-d916-4072-897e-222a1077ef80" (UID: "99bc45f2-d916-4072-897e-222a1077ef80"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:34:24.000237 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.000190 2561 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-registry-certificates\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:34:24.000237 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.000233 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99bc45f2-d916-4072-897e-222a1077ef80-trusted-ca\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:34:24.000237 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.000244 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b66mq\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-kube-api-access-b66mq\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:34:24.000237 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.000255 2561 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-installation-pull-secrets\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:34:24.000512 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.000265 2561 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-registry-tls\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:34:24.000512 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.000273 2561 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99bc45f2-d916-4072-897e-222a1077ef80-bound-sa-token\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:34:24.000512 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:34:24.000281 2561 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/99bc45f2-d916-4072-897e-222a1077ef80-ca-trust-extracted\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:34:24.000512 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.000290 2561 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/99bc45f2-d916-4072-897e-222a1077ef80-image-registry-private-configuration\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:34:24.179937 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.179850 2561 generic.go:358] "Generic (PLEG): container finished" podID="99bc45f2-d916-4072-897e-222a1077ef80" containerID="9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d" exitCode=0 Apr 17 16:34:24.179937 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.179907 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" Apr 17 16:34:24.179937 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.179924 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" event={"ID":"99bc45f2-d916-4072-897e-222a1077ef80","Type":"ContainerDied","Data":"9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d"} Apr 17 16:34:24.180217 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.179957 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c58cd94cc-lznkk" event={"ID":"99bc45f2-d916-4072-897e-222a1077ef80","Type":"ContainerDied","Data":"d7c6148ca226c5e299d15bf525b6217430e7cc43199cd6ff654d7f81f6a95b00"} Apr 17 16:34:24.180217 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.179974 2561 scope.go:117] "RemoveContainer" containerID="9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d" Apr 17 16:34:24.188129 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.188111 2561 scope.go:117] "RemoveContainer" containerID="9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d" Apr 17 16:34:24.188393 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:34:24.188365 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d\": container with ID starting with 9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d not found: ID does not exist" containerID="9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d" Apr 17 16:34:24.188487 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.188398 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d"} err="failed to get container status 
\"9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d\": rpc error: code = NotFound desc = could not find container \"9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d\": container with ID starting with 9463bb238d6211792843bd4424bf327569cb19eaff60f20b84903ecb9d61d56d not found: ID does not exist" Apr 17 16:34:24.208349 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.208318 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c58cd94cc-lznkk"] Apr 17 16:34:24.215577 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.215546 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6c58cd94cc-lznkk"] Apr 17 16:34:24.522459 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:24.522378 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bc45f2-d916-4072-897e-222a1077ef80" path="/var/lib/kubelet/pods/99bc45f2-d916-4072-897e-222a1077ef80/volumes" Apr 17 16:34:25.834550 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:25.834514 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" podUID="3ef41909-613b-41f9-9401-d08531c9d28c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:34:35.834253 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:35.834204 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" podUID="3ef41909-613b-41f9-9401-d08531c9d28c" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 16:34:35.834623 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:35.834289 2561 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" Apr 17 16:34:35.834751 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:35.834733 2561 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"569ea815012c05f314170eb65fec6f27aa619c71d9f724f2ba63f2ea09275d67"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 16:34:35.834787 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:35.834770 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" podUID="3ef41909-613b-41f9-9401-d08531c9d28c" containerName="service-proxy" containerID="cri-o://569ea815012c05f314170eb65fec6f27aa619c71d9f724f2ba63f2ea09275d67" gracePeriod=30 Apr 17 16:34:36.211100 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:36.211004 2561 generic.go:358] "Generic (PLEG): container finished" podID="3ef41909-613b-41f9-9401-d08531c9d28c" containerID="569ea815012c05f314170eb65fec6f27aa619c71d9f724f2ba63f2ea09275d67" exitCode=2 Apr 17 16:34:36.211100 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:36.211059 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" event={"ID":"3ef41909-613b-41f9-9401-d08531c9d28c","Type":"ContainerDied","Data":"569ea815012c05f314170eb65fec6f27aa619c71d9f724f2ba63f2ea09275d67"} Apr 17 16:34:36.211271 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:34:36.211106 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-74b858c94b-jjklq" event={"ID":"3ef41909-613b-41f9-9401-d08531c9d28c","Type":"ContainerStarted","Data":"5e9ebae626da8edb4570799a0af52e6222403e3f8e79cceb5b6e3459a0990bc6"} Apr 17 16:35:04.395469 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:04.395376 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:35:04.397641 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:04.397619 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a91f76e-d64e-4d72-92ff-c27c12f465d2-metrics-certs\") pod \"network-metrics-daemon-wdt9k\" (UID: \"8a91f76e-d64e-4d72-92ff-c27c12f465d2\") " pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:35:04.521687 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:04.521660 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qkztw\"" Apr 17 16:35:04.529659 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:04.529634 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wdt9k" Apr 17 16:35:04.641424 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:04.641390 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wdt9k"] Apr 17 16:35:04.644677 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:35:04.644648 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a91f76e_d64e_4d72_92ff_c27c12f465d2.slice/crio-869765824be3193d1c4ada5114326d598dfe949a7d2408e6eac7bcd106865425 WatchSource:0}: Error finding container 869765824be3193d1c4ada5114326d598dfe949a7d2408e6eac7bcd106865425: Status 404 returned error can't find the container with id 869765824be3193d1c4ada5114326d598dfe949a7d2408e6eac7bcd106865425 Apr 17 16:35:05.283561 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:05.283520 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdt9k" event={"ID":"8a91f76e-d64e-4d72-92ff-c27c12f465d2","Type":"ContainerStarted","Data":"869765824be3193d1c4ada5114326d598dfe949a7d2408e6eac7bcd106865425"} Apr 17 16:35:06.288039 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:06.287946 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdt9k" event={"ID":"8a91f76e-d64e-4d72-92ff-c27c12f465d2","Type":"ContainerStarted","Data":"989b5ded48a74e80c6f33d22909f1307886f8553484fb2439c4763cfb0bc7465"} Apr 17 16:35:06.288039 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:06.287986 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wdt9k" event={"ID":"8a91f76e-d64e-4d72-92ff-c27c12f465d2","Type":"ContainerStarted","Data":"8983d1022304a733762287a76b2d707506735b90bf885162e60f720205ce3970"} Apr 17 16:35:06.305254 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:06.305191 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-wdt9k" podStartSLOduration=253.062450613 podStartE2EDuration="4m14.305169s" podCreationTimestamp="2026-04-17 16:30:52 +0000 UTC" firstStartedPulling="2026-04-17 16:35:04.64644724 +0000 UTC m=+252.544208676" lastFinishedPulling="2026-04-17 16:35:05.889165615 +0000 UTC m=+253.786927063" observedRunningTime="2026-04-17 16:35:06.303718902 +0000 UTC m=+254.201480358" watchObservedRunningTime="2026-04-17 16:35:06.305169 +0000 UTC m=+254.202930456" Apr 17 16:35:52.476324 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:35:52.476299 2561 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:36:42.938987 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.938898 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7"] Apr 17 16:36:42.939397 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.939156 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99bc45f2-d916-4072-897e-222a1077ef80" containerName="registry" Apr 17 16:36:42.939397 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.939169 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bc45f2-d916-4072-897e-222a1077ef80" containerName="registry" Apr 17 16:36:42.939397 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.939208 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="99bc45f2-d916-4072-897e-222a1077ef80" containerName="registry" Apr 17 16:36:42.941819 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.941802 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:42.944377 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.944356 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 16:36:42.944501 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.944357 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 16:36:42.945164 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.945147 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 16:36:42.945245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.945167 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 16:36:42.945245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.945157 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-p2f7z\"" Apr 17 16:36:42.945245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.945195 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 16:36:42.958565 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:42.958539 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7"] Apr 17 16:36:43.061278 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.061235 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.061450 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.061296 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6f82372e-61ae-4d0c-a76e-baa6923968af-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.061450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.061418 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjl4w\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-kube-api-access-gjl4w\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.162049 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.162017 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjl4w\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-kube-api-access-gjl4w\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.162231 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.162056 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.162231 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.162110 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: 
\"kubernetes.io/empty-dir/6f82372e-61ae-4d0c-a76e-baa6923968af-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.162309 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.162248 2561 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:36:43.162309 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.162271 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:36:43.162309 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.162291 2561 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 16:36:43.162424 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.162312 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 16:36:43.162424 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.162384 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates podName:6f82372e-61ae-4d0c-a76e-baa6923968af nodeName:}" failed. No retries permitted until 2026-04-17 16:36:43.662363474 +0000 UTC m=+351.560124910 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates") pod "keda-metrics-apiserver-7c9f485588-t7bl7" (UID: "6f82372e-61ae-4d0c-a76e-baa6923968af") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 16:36:43.162508 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.162453 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6f82372e-61ae-4d0c-a76e-baa6923968af-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.173050 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.173025 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjl4w\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-kube-api-access-gjl4w\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.265363 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.265279 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-fpzn2"] Apr 17 16:36:43.268303 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.268282 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:43.270565 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.270541 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 16:36:43.279373 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.279348 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-fpzn2"] Apr 17 16:36:43.363181 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.363143 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-certificates\") pod \"keda-admission-cf49989db-fpzn2\" (UID: \"a464f12e-c4d5-4480-ba4d-1600f9de4823\") " pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:43.363181 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.363187 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnhf\" (UniqueName: \"kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-kube-api-access-mwnhf\") pod \"keda-admission-cf49989db-fpzn2\" (UID: \"a464f12e-c4d5-4480-ba4d-1600f9de4823\") " pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:43.463677 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.463635 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-certificates\") pod \"keda-admission-cf49989db-fpzn2\" (UID: \"a464f12e-c4d5-4480-ba4d-1600f9de4823\") " pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:43.463819 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.463686 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnhf\" (UniqueName: 
\"kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-kube-api-access-mwnhf\") pod \"keda-admission-cf49989db-fpzn2\" (UID: \"a464f12e-c4d5-4480-ba4d-1600f9de4823\") " pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:43.463819 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.463787 2561 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 17 16:36:43.463819 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.463813 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-fpzn2: secret "keda-admission-webhooks-certs" not found Apr 17 16:36:43.463988 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.463864 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-certificates podName:a464f12e-c4d5-4480-ba4d-1600f9de4823 nodeName:}" failed. No retries permitted until 2026-04-17 16:36:43.963848867 +0000 UTC m=+351.861610300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-certificates") pod "keda-admission-cf49989db-fpzn2" (UID: "a464f12e-c4d5-4480-ba4d-1600f9de4823") : secret "keda-admission-webhooks-certs" not found Apr 17 16:36:43.472485 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.472453 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnhf\" (UniqueName: \"kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-kube-api-access-mwnhf\") pod \"keda-admission-cf49989db-fpzn2\" (UID: \"a464f12e-c4d5-4480-ba4d-1600f9de4823\") " pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:43.664855 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.664826 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:43.665013 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.664929 2561 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:36:43.665013 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.664943 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:36:43.665013 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.664956 2561 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 16:36:43.665013 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.664972 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7: [references non-existent secret key: tls.crt, secret 
"keda-metrics-apiserver-certs" not found] Apr 17 16:36:43.665175 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:43.665019 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates podName:6f82372e-61ae-4d0c-a76e-baa6923968af nodeName:}" failed. No retries permitted until 2026-04-17 16:36:44.665005948 +0000 UTC m=+352.562767380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates") pod "keda-metrics-apiserver-7c9f485588-t7bl7" (UID: "6f82372e-61ae-4d0c-a76e-baa6923968af") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 16:36:43.966707 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.966618 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-certificates\") pod \"keda-admission-cf49989db-fpzn2\" (UID: \"a464f12e-c4d5-4480-ba4d-1600f9de4823\") " pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:43.968957 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:43.968934 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a464f12e-c4d5-4480-ba4d-1600f9de4823-certificates\") pod \"keda-admission-cf49989db-fpzn2\" (UID: \"a464f12e-c4d5-4480-ba4d-1600f9de4823\") " pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:44.178718 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:44.178667 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:44.309839 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:44.309661 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-fpzn2"] Apr 17 16:36:44.312554 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:36:44.312519 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda464f12e_c4d5_4480_ba4d_1600f9de4823.slice/crio-45ca057ef37841e2bee1c1e052613a57c98ed6e4c535eadf908938108abd7342 WatchSource:0}: Error finding container 45ca057ef37841e2bee1c1e052613a57c98ed6e4c535eadf908938108abd7342: Status 404 returned error can't find the container with id 45ca057ef37841e2bee1c1e052613a57c98ed6e4c535eadf908938108abd7342 Apr 17 16:36:44.313725 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:44.313709 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:36:44.532859 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:44.532782 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-fpzn2" event={"ID":"a464f12e-c4d5-4480-ba4d-1600f9de4823","Type":"ContainerStarted","Data":"45ca057ef37841e2bee1c1e052613a57c98ed6e4c535eadf908938108abd7342"} Apr 17 16:36:44.672157 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:44.672118 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:44.672334 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:44.672244 2561 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:36:44.672334 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:44.672256 
2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:36:44.672334 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:44.672273 2561 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7: references non-existent secret key: tls.crt Apr 17 16:36:44.672334 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:44.672318 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates podName:6f82372e-61ae-4d0c-a76e-baa6923968af nodeName:}" failed. No retries permitted until 2026-04-17 16:36:46.67230499 +0000 UTC m=+354.570066422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates") pod "keda-metrics-apiserver-7c9f485588-t7bl7" (UID: "6f82372e-61ae-4d0c-a76e-baa6923968af") : references non-existent secret key: tls.crt Apr 17 16:36:46.689278 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:46.689242 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:46.689635 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:46.689375 2561 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:36:46.689635 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:46.689391 2561 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:36:46.689635 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:46.689409 2561 projected.go:194] Error preparing data for projected volume 
certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7: references non-existent secret key: tls.crt Apr 17 16:36:46.689635 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:36:46.689463 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates podName:6f82372e-61ae-4d0c-a76e-baa6923968af nodeName:}" failed. No retries permitted until 2026-04-17 16:36:50.689446552 +0000 UTC m=+358.587208002 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates") pod "keda-metrics-apiserver-7c9f485588-t7bl7" (UID: "6f82372e-61ae-4d0c-a76e-baa6923968af") : references non-existent secret key: tls.crt Apr 17 16:36:47.543234 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:47.543195 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-fpzn2" event={"ID":"a464f12e-c4d5-4480-ba4d-1600f9de4823","Type":"ContainerStarted","Data":"3b5d4d4077cf6e6b7edfecd7591a5d0c93ab3ab7b4769e5eebff8a41f0c6d4b6"} Apr 17 16:36:47.543479 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:47.543454 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:36:47.560000 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:47.558968 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-fpzn2" podStartSLOduration=2.271299945 podStartE2EDuration="4.558948228s" podCreationTimestamp="2026-04-17 16:36:43 +0000 UTC" firstStartedPulling="2026-04-17 16:36:44.313852229 +0000 UTC m=+352.211613663" lastFinishedPulling="2026-04-17 16:36:46.601500498 +0000 UTC m=+354.499261946" observedRunningTime="2026-04-17 16:36:47.558190599 +0000 UTC m=+355.455952056" watchObservedRunningTime="2026-04-17 16:36:47.558948228 +0000 UTC m=+355.456709684" 
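The pod_startup_latency_tracker entry above reports two durations for keda-admission-cf49989db-fpzn2: podStartE2EDuration (creation to observed running) and the smaller podStartSLOduration. The arithmetic relating them can be checked from the entry's own timestamps: the SLO duration is the end-to-end duration minus the image-pull window (firstStartedPulling to lastFinishedPulling). The sketch below is an illustration using values copied from that log entry, not the kubelet's actual tracker code; the `parse` helper is mine, and fractional seconds are truncated to microseconds for `strptime`.

```python
from datetime import datetime

def parse(ts: str) -> datetime:
    # Timestamps as they appear in the log, truncated to microsecond
    # precision (Python's %f accepts at most six fractional digits).
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S.%f %z")

# Values copied from the pod_startup_latency_tracker entry for
# openshift-keda/keda-admission-cf49989db-fpzn2.
creation   = parse("2026-04-17 16:36:43.000000 +0000")  # podCreationTimestamp
first_pull = parse("2026-04-17 16:36:44.313852 +0000")  # firstStartedPulling
last_pull  = parse("2026-04-17 16:36:46.601500 +0000")  # lastFinishedPulling
observed   = parse("2026-04-17 16:36:47.558948 +0000")  # watchObservedRunningTime

e2e  = (observed - creation).total_seconds()    # ~4.559s = podStartE2EDuration
pull = (last_pull - first_pull).total_seconds() # time spent pulling images
slo  = e2e - pull                               # ~2.271s = podStartSLOduration
```

Run against the logged values, `e2e` comes out near the reported 4.558948228s and `slo` near 2.271299945s, confirming that the SLO-tracked duration simply excludes image-pull time from the end-to-end startup latency.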
Apr 17 16:36:50.719730 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:50.719698 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:50.722117 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:50.722095 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6f82372e-61ae-4d0c-a76e-baa6923968af-certificates\") pod \"keda-metrics-apiserver-7c9f485588-t7bl7\" (UID: \"6f82372e-61ae-4d0c-a76e-baa6923968af\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:50.750770 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:50.750742 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:50.862489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:50.862391 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7"] Apr 17 16:36:50.865162 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:36:50.865116 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f82372e_61ae_4d0c_a76e_baa6923968af.slice/crio-e3c2065a81346277fb91f922def8f2efef8a8e90d53ed84d91a2a3b94128524f WatchSource:0}: Error finding container e3c2065a81346277fb91f922def8f2efef8a8e90d53ed84d91a2a3b94128524f: Status 404 returned error can't find the container with id e3c2065a81346277fb91f922def8f2efef8a8e90d53ed84d91a2a3b94128524f Apr 17 16:36:51.554640 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:51.554603 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" event={"ID":"6f82372e-61ae-4d0c-a76e-baa6923968af","Type":"ContainerStarted","Data":"e3c2065a81346277fb91f922def8f2efef8a8e90d53ed84d91a2a3b94128524f"} Apr 17 16:36:53.560563 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:53.560530 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" event={"ID":"6f82372e-61ae-4d0c-a76e-baa6923968af","Type":"ContainerStarted","Data":"462388d84482dfd7b497e7fdfff261b53ee3c84e981dc22165279a63007cfd4c"} Apr 17 16:36:53.560898 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:53.560671 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:36:53.575426 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:36:53.575371 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" podStartSLOduration=9.02166522 podStartE2EDuration="11.575352564s" podCreationTimestamp="2026-04-17 16:36:42 +0000 UTC" firstStartedPulling="2026-04-17 16:36:50.866763378 +0000 UTC m=+358.764524812" lastFinishedPulling="2026-04-17 16:36:53.420450704 +0000 UTC m=+361.318212156" observedRunningTime="2026-04-17 16:36:53.575195448 +0000 UTC m=+361.472956907" watchObservedRunningTime="2026-04-17 16:36:53.575352564 +0000 UTC m=+361.473114021" Apr 17 16:37:04.568423 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:37:04.568396 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-t7bl7" Apr 17 16:37:08.548531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:37:08.548505 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-fpzn2" Apr 17 16:38:05.464103 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.464003 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z"] Apr 17 16:38:05.466817 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.466801 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" Apr 17 16:38:05.469342 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.469321 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:38:05.470169 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.470152 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-kggq6\"" Apr 17 16:38:05.470246 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.470193 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 16:38:05.475459 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.475437 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z"] Apr 17 16:38:05.643931 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.643897 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mm2\" (UniqueName: \"kubernetes.io/projected/446e45a1-b00b-469f-9068-28457cf96abf-kube-api-access-55mm2\") pod \"openshift-lws-operator-bfc7f696d-nnj8z\" (UID: \"446e45a1-b00b-469f-9068-28457cf96abf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" Apr 17 16:38:05.644143 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.643946 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/446e45a1-b00b-469f-9068-28457cf96abf-tmp\") pod \"openshift-lws-operator-bfc7f696d-nnj8z\" (UID: 
\"446e45a1-b00b-469f-9068-28457cf96abf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" Apr 17 16:38:05.744612 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.744506 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55mm2\" (UniqueName: \"kubernetes.io/projected/446e45a1-b00b-469f-9068-28457cf96abf-kube-api-access-55mm2\") pod \"openshift-lws-operator-bfc7f696d-nnj8z\" (UID: \"446e45a1-b00b-469f-9068-28457cf96abf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" Apr 17 16:38:05.744612 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.744567 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/446e45a1-b00b-469f-9068-28457cf96abf-tmp\") pod \"openshift-lws-operator-bfc7f696d-nnj8z\" (UID: \"446e45a1-b00b-469f-9068-28457cf96abf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" Apr 17 16:38:05.744954 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.744932 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/446e45a1-b00b-469f-9068-28457cf96abf-tmp\") pod \"openshift-lws-operator-bfc7f696d-nnj8z\" (UID: \"446e45a1-b00b-469f-9068-28457cf96abf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" Apr 17 16:38:05.755443 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.755410 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mm2\" (UniqueName: \"kubernetes.io/projected/446e45a1-b00b-469f-9068-28457cf96abf-kube-api-access-55mm2\") pod \"openshift-lws-operator-bfc7f696d-nnj8z\" (UID: \"446e45a1-b00b-469f-9068-28457cf96abf\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" Apr 17 16:38:05.776386 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.776359 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" Apr 17 16:38:05.899502 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:05.899473 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z"] Apr 17 16:38:05.902488 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:38:05.902459 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod446e45a1_b00b_469f_9068_28457cf96abf.slice/crio-34e1ffd816f4c37702370d6993d8a5896780e9b9c6879474850e6ca8a2ef2cc3 WatchSource:0}: Error finding container 34e1ffd816f4c37702370d6993d8a5896780e9b9c6879474850e6ca8a2ef2cc3: Status 404 returned error can't find the container with id 34e1ffd816f4c37702370d6993d8a5896780e9b9c6879474850e6ca8a2ef2cc3 Apr 17 16:38:06.755346 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:06.755308 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" event={"ID":"446e45a1-b00b-469f-9068-28457cf96abf","Type":"ContainerStarted","Data":"34e1ffd816f4c37702370d6993d8a5896780e9b9c6879474850e6ca8a2ef2cc3"} Apr 17 16:38:08.762723 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:08.762631 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" event={"ID":"446e45a1-b00b-469f-9068-28457cf96abf","Type":"ContainerStarted","Data":"5f7e634c7d39f67abbfde1c750368cbc560bfc5c03a572992f4f23e488757d17"} Apr 17 16:38:08.781427 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:08.781384 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-nnj8z" podStartSLOduration=1.349801028 podStartE2EDuration="3.781368691s" podCreationTimestamp="2026-04-17 16:38:05 +0000 UTC" firstStartedPulling="2026-04-17 16:38:05.904014402 +0000 UTC m=+433.801775841" 
lastFinishedPulling="2026-04-17 16:38:08.335582072 +0000 UTC m=+436.233343504" observedRunningTime="2026-04-17 16:38:08.780525756 +0000 UTC m=+436.678287210" watchObservedRunningTime="2026-04-17 16:38:08.781368691 +0000 UTC m=+436.679130147" Apr 17 16:38:37.640819 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.640783 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-md4d5"] Apr 17 16:38:37.643752 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.643731 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:37.646936 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.646910 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 16:38:37.646936 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.646931 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 16:38:37.647916 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.647884 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-7lvnt\"" Apr 17 16:38:37.657698 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.657673 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-md4d5"] Apr 17 16:38:37.674515 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.674482 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6fa643e9-d4d2-4c95-8df5-160cac5eb758-operator-config\") pod \"servicemesh-operator3-55f49c5f94-md4d5\" (UID: \"6fa643e9-d4d2-4c95-8df5-160cac5eb758\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:37.674671 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.674530 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvd2s\" (UniqueName: \"kubernetes.io/projected/6fa643e9-d4d2-4c95-8df5-160cac5eb758-kube-api-access-wvd2s\") pod \"servicemesh-operator3-55f49c5f94-md4d5\" (UID: \"6fa643e9-d4d2-4c95-8df5-160cac5eb758\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:37.775463 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.775427 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6fa643e9-d4d2-4c95-8df5-160cac5eb758-operator-config\") pod \"servicemesh-operator3-55f49c5f94-md4d5\" (UID: \"6fa643e9-d4d2-4c95-8df5-160cac5eb758\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:37.775463 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.775466 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvd2s\" (UniqueName: \"kubernetes.io/projected/6fa643e9-d4d2-4c95-8df5-160cac5eb758-kube-api-access-wvd2s\") pod \"servicemesh-operator3-55f49c5f94-md4d5\" (UID: \"6fa643e9-d4d2-4c95-8df5-160cac5eb758\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:37.778530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.778504 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6fa643e9-d4d2-4c95-8df5-160cac5eb758-operator-config\") pod \"servicemesh-operator3-55f49c5f94-md4d5\" (UID: \"6fa643e9-d4d2-4c95-8df5-160cac5eb758\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:37.784853 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.784825 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvd2s\" (UniqueName: 
\"kubernetes.io/projected/6fa643e9-d4d2-4c95-8df5-160cac5eb758-kube-api-access-wvd2s\") pod \"servicemesh-operator3-55f49c5f94-md4d5\" (UID: \"6fa643e9-d4d2-4c95-8df5-160cac5eb758\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:37.952492 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:37.952405 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:38.077904 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:38.077878 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-md4d5"] Apr 17 16:38:38.080691 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:38:38.080660 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa643e9_d4d2_4c95_8df5_160cac5eb758.slice/crio-27639359af46117b687e0037afcb636b705d35f4afc92dee2e8b4b7ea9b5290c WatchSource:0}: Error finding container 27639359af46117b687e0037afcb636b705d35f4afc92dee2e8b4b7ea9b5290c: Status 404 returned error can't find the container with id 27639359af46117b687e0037afcb636b705d35f4afc92dee2e8b4b7ea9b5290c Apr 17 16:38:38.843709 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:38.843672 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" event={"ID":"6fa643e9-d4d2-4c95-8df5-160cac5eb758","Type":"ContainerStarted","Data":"27639359af46117b687e0037afcb636b705d35f4afc92dee2e8b4b7ea9b5290c"} Apr 17 16:38:41.854771 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:41.854736 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" event={"ID":"6fa643e9-d4d2-4c95-8df5-160cac5eb758","Type":"ContainerStarted","Data":"8a5b2e6c083fde52ec9fe99e283327ae7379d1d010560b832aef96c59aa44963"} Apr 17 16:38:41.855142 ip-10-0-130-35 kubenswrapper[2561]: 
I0417 16:38:41.854855 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:41.897091 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:41.897020 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" podStartSLOduration=1.94377468 podStartE2EDuration="4.897004834s" podCreationTimestamp="2026-04-17 16:38:37 +0000 UTC" firstStartedPulling="2026-04-17 16:38:38.082983637 +0000 UTC m=+465.980745069" lastFinishedPulling="2026-04-17 16:38:41.03621379 +0000 UTC m=+468.933975223" observedRunningTime="2026-04-17 16:38:41.892974847 +0000 UTC m=+469.790736302" watchObservedRunningTime="2026-04-17 16:38:41.897004834 +0000 UTC m=+469.794766288" Apr 17 16:38:50.662631 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.662594 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc"] Apr 17 16:38:50.665937 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.665914 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.672810 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.672777 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 16:38:50.672924 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.672842 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-pm269\"" Apr 17 16:38:50.672924 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.672890 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 16:38:50.673131 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.672842 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 16:38:50.673378 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.673355 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 16:38:50.673659 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.673640 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 16:38:50.673901 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.673885 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 16:38:50.685606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.685582 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc"] Apr 17 16:38:50.771643 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.771602 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: 
\"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.771643 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.771644 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n52nv\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-kube-api-access-n52nv\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.771846 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.771666 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.771846 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.771749 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1001307d-8d60-4e38-9e36-6e5f54d26cc4-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.771846 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.771781 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: 
\"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.771846 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.771813 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.771972 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.771870 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.873130 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.873086 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.873355 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.873140 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.873355 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:38:50.873167 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n52nv\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-kube-api-access-n52nv\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.873355 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.873200 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.873355 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.873237 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1001307d-8d60-4e38-9e36-6e5f54d26cc4-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.873654 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.873438 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.873654 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.873505 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.874059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.874037 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.875827 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.875803 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.875929 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.875911 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1001307d-8d60-4e38-9e36-6e5f54d26cc4-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.876119 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.876098 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.876757 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.876739 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.892765 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.892732 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.894295 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.894273 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n52nv\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-kube-api-access-n52nv\") pod \"istiod-openshift-gateway-7cd77c7ffd-dv6wc\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:50.976779 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:50.976689 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:51.114012 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:51.113965 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc"] Apr 17 16:38:51.117234 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:38:51.117206 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1001307d_8d60_4e38_9e36_6e5f54d26cc4.slice/crio-6d41f52efea34b85e375a6fa51a4c185460812f91f8458f9705a9eb6670dd0e5 WatchSource:0}: Error finding container 6d41f52efea34b85e375a6fa51a4c185460812f91f8458f9705a9eb6670dd0e5: Status 404 returned error can't find the container with id 6d41f52efea34b85e375a6fa51a4c185460812f91f8458f9705a9eb6670dd0e5 Apr 17 16:38:51.886805 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:51.886768 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" event={"ID":"1001307d-8d60-4e38-9e36-6e5f54d26cc4","Type":"ContainerStarted","Data":"6d41f52efea34b85e375a6fa51a4c185460812f91f8458f9705a9eb6670dd0e5"} Apr 17 16:38:52.859976 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:52.859941 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-md4d5" Apr 17 16:38:53.544115 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:53.544057 2561 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:38:53.544599 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:53.544178 2561 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 16:38:53.899933 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:38:53.899878 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" event={"ID":"1001307d-8d60-4e38-9e36-6e5f54d26cc4","Type":"ContainerStarted","Data":"9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a"} Apr 17 16:38:53.900144 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:53.900091 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:38:53.944547 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:53.944490 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" podStartSLOduration=1.519617712 podStartE2EDuration="3.944472005s" podCreationTimestamp="2026-04-17 16:38:50 +0000 UTC" firstStartedPulling="2026-04-17 16:38:51.119001764 +0000 UTC m=+479.016763198" lastFinishedPulling="2026-04-17 16:38:53.543856055 +0000 UTC m=+481.441617491" observedRunningTime="2026-04-17 16:38:53.941110105 +0000 UTC m=+481.838871559" watchObservedRunningTime="2026-04-17 16:38:53.944472005 +0000 UTC m=+481.842233462" Apr 17 16:38:54.905327 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:38:54.905299 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:39:29.260520 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.260477 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq"] Apr 17 16:39:29.268524 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.268501 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" Apr 17 16:39:29.271216 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.271196 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-x8kmt\"" Apr 17 16:39:29.271864 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.271843 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 16:39:29.272486 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.272450 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 16:39:29.272781 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.272759 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 16:39:29.278435 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.277950 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq"] Apr 17 16:39:29.368895 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.368800 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjgh6\" (UniqueName: \"kubernetes.io/projected/36b7257f-c549-4ab8-bae6-f19e9c6af6ca-kube-api-access-kjgh6\") pod \"dns-operator-controller-manager-844548ff4c-gd8tq\" (UID: \"36b7257f-c549-4ab8-bae6-f19e9c6af6ca\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" Apr 17 16:39:29.470049 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.470007 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjgh6\" (UniqueName: \"kubernetes.io/projected/36b7257f-c549-4ab8-bae6-f19e9c6af6ca-kube-api-access-kjgh6\") pod \"dns-operator-controller-manager-844548ff4c-gd8tq\" 
(UID: \"36b7257f-c549-4ab8-bae6-f19e9c6af6ca\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" Apr 17 16:39:29.478580 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.478556 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjgh6\" (UniqueName: \"kubernetes.io/projected/36b7257f-c549-4ab8-bae6-f19e9c6af6ca-kube-api-access-kjgh6\") pod \"dns-operator-controller-manager-844548ff4c-gd8tq\" (UID: \"36b7257f-c549-4ab8-bae6-f19e9c6af6ca\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" Apr 17 16:39:29.581530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.581498 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" Apr 17 16:39:29.707602 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:29.707570 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq"] Apr 17 16:39:29.710813 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:39:29.710785 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b7257f_c549_4ab8_bae6_f19e9c6af6ca.slice/crio-a37fa625f26141150939b58326e052aecb3b3b0903d5b07e72360f7776614874 WatchSource:0}: Error finding container a37fa625f26141150939b58326e052aecb3b3b0903d5b07e72360f7776614874: Status 404 returned error can't find the container with id a37fa625f26141150939b58326e052aecb3b3b0903d5b07e72360f7776614874 Apr 17 16:39:30.010893 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:30.010801 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" event={"ID":"36b7257f-c549-4ab8-bae6-f19e9c6af6ca","Type":"ContainerStarted","Data":"a37fa625f26141150939b58326e052aecb3b3b0903d5b07e72360f7776614874"} Apr 17 16:39:31.973189 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:39:31.973152 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-q7jmp"] Apr 17 16:39:31.976383 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:31.976360 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" Apr 17 16:39:31.978769 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:31.978746 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-w72fv\"" Apr 17 16:39:31.988597 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:31.988571 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-q7jmp"] Apr 17 16:39:31.991242 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:31.991219 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrt87\" (UniqueName: \"kubernetes.io/projected/d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06-kube-api-access-zrt87\") pod \"authorino-operator-7587b89b76-q7jmp\" (UID: \"d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06\") " pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" Apr 17 16:39:32.092693 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:32.092647 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrt87\" (UniqueName: \"kubernetes.io/projected/d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06-kube-api-access-zrt87\") pod \"authorino-operator-7587b89b76-q7jmp\" (UID: \"d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06\") " pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" Apr 17 16:39:32.101836 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:32.101810 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrt87\" (UniqueName: \"kubernetes.io/projected/d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06-kube-api-access-zrt87\") pod \"authorino-operator-7587b89b76-q7jmp\" (UID: 
\"d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06\") " pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" Apr 17 16:39:32.288142 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:32.288042 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" Apr 17 16:39:32.430048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:32.430021 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-q7jmp"] Apr 17 16:39:32.438782 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:39:32.436242 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd694cd4e_5c4f_47bd_9ef8_702ad3dc4a06.slice/crio-87a802cf4d42f8c1461fa45c84aa6c1ff4a2ed3746fa2cebf7e177a0ac5c7cb8 WatchSource:0}: Error finding container 87a802cf4d42f8c1461fa45c84aa6c1ff4a2ed3746fa2cebf7e177a0ac5c7cb8: Status 404 returned error can't find the container with id 87a802cf4d42f8c1461fa45c84aa6c1ff4a2ed3746fa2cebf7e177a0ac5c7cb8 Apr 17 16:39:33.022343 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:33.022307 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" event={"ID":"36b7257f-c549-4ab8-bae6-f19e9c6af6ca","Type":"ContainerStarted","Data":"d0ae58c03fe4481324e632832c311e5fbc0a9b944a8484061e4cae94c966b3ff"} Apr 17 16:39:33.022834 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:33.022369 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" Apr 17 16:39:33.023443 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:33.023422 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" 
event={"ID":"d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06","Type":"ContainerStarted","Data":"87a802cf4d42f8c1461fa45c84aa6c1ff4a2ed3746fa2cebf7e177a0ac5c7cb8"} Apr 17 16:39:33.044091 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:33.044022 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" podStartSLOduration=1.400122228 podStartE2EDuration="4.04400789s" podCreationTimestamp="2026-04-17 16:39:29 +0000 UTC" firstStartedPulling="2026-04-17 16:39:29.712845797 +0000 UTC m=+517.610607233" lastFinishedPulling="2026-04-17 16:39:32.356731462 +0000 UTC m=+520.254492895" observedRunningTime="2026-04-17 16:39:33.042746246 +0000 UTC m=+520.940507702" watchObservedRunningTime="2026-04-17 16:39:33.04400789 +0000 UTC m=+520.941769346" Apr 17 16:39:34.027811 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.027774 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" event={"ID":"d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06","Type":"ContainerStarted","Data":"a89ca90ba07284b17bc655392f0807cc6fb698eb893fb46887ae7703b93a0e1c"} Apr 17 16:39:34.078304 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.078247 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" podStartSLOduration=1.617775569 podStartE2EDuration="3.078222364s" podCreationTimestamp="2026-04-17 16:39:31 +0000 UTC" firstStartedPulling="2026-04-17 16:39:32.441687947 +0000 UTC m=+520.339449387" lastFinishedPulling="2026-04-17 16:39:33.902134749 +0000 UTC m=+521.799896182" observedRunningTime="2026-04-17 16:39:34.078095361 +0000 UTC m=+521.975856809" watchObservedRunningTime="2026-04-17 16:39:34.078222364 +0000 UTC m=+521.975983824" Apr 17 16:39:34.349917 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.349880 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn"] Apr 17 16:39:34.353275 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.353256 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" Apr 17 16:39:34.356048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.356024 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-4xlxw\"" Apr 17 16:39:34.369111 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.369066 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn"] Apr 17 16:39:34.412457 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.412412 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kwwk\" (UniqueName: \"kubernetes.io/projected/753159f9-d923-44a6-83cf-0139114f5f7d-kube-api-access-6kwwk\") pod \"limitador-operator-controller-manager-c7fb4c8d5-xb4wn\" (UID: \"753159f9-d923-44a6-83cf-0139114f5f7d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" Apr 17 16:39:34.512970 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.512926 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kwwk\" (UniqueName: \"kubernetes.io/projected/753159f9-d923-44a6-83cf-0139114f5f7d-kube-api-access-6kwwk\") pod \"limitador-operator-controller-manager-c7fb4c8d5-xb4wn\" (UID: \"753159f9-d923-44a6-83cf-0139114f5f7d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" Apr 17 16:39:34.522999 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.522970 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kwwk\" (UniqueName: 
\"kubernetes.io/projected/753159f9-d923-44a6-83cf-0139114f5f7d-kube-api-access-6kwwk\") pod \"limitador-operator-controller-manager-c7fb4c8d5-xb4wn\" (UID: \"753159f9-d923-44a6-83cf-0139114f5f7d\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" Apr 17 16:39:34.662837 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.662749 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" Apr 17 16:39:34.798391 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:34.798360 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn"] Apr 17 16:39:34.801399 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:39:34.801369 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod753159f9_d923_44a6_83cf_0139114f5f7d.slice/crio-8967087a6bda779d5dca587484fefe9234f1cf78075f8392efb7aeea2a946c71 WatchSource:0}: Error finding container 8967087a6bda779d5dca587484fefe9234f1cf78075f8392efb7aeea2a946c71: Status 404 returned error can't find the container with id 8967087a6bda779d5dca587484fefe9234f1cf78075f8392efb7aeea2a946c71 Apr 17 16:39:35.032402 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:35.032312 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" event={"ID":"753159f9-d923-44a6-83cf-0139114f5f7d","Type":"ContainerStarted","Data":"8967087a6bda779d5dca587484fefe9234f1cf78075f8392efb7aeea2a946c71"} Apr 17 16:39:35.032759 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:35.032465 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" Apr 17 16:39:37.039833 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:37.039801 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" event={"ID":"753159f9-d923-44a6-83cf-0139114f5f7d","Type":"ContainerStarted","Data":"749a05d5692b0d1f92f4aba875fe2eb7c4a1714eab5d8187b1ae9d496e65c7d6"} Apr 17 16:39:37.040204 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:37.039903 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" Apr 17 16:39:37.057745 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:37.057657 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" podStartSLOduration=1.650356868 podStartE2EDuration="3.057639395s" podCreationTimestamp="2026-04-17 16:39:34 +0000 UTC" firstStartedPulling="2026-04-17 16:39:34.803408302 +0000 UTC m=+522.701169738" lastFinishedPulling="2026-04-17 16:39:36.210690831 +0000 UTC m=+524.108452265" observedRunningTime="2026-04-17 16:39:37.055896502 +0000 UTC m=+524.953657957" watchObservedRunningTime="2026-04-17 16:39:37.057639395 +0000 UTC m=+524.955400850" Apr 17 16:39:44.030742 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:44.030708 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-gd8tq" Apr 17 16:39:46.038352 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:46.038316 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-q7jmp" Apr 17 16:39:48.045525 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:39:48.045489 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-xb4wn" Apr 17 16:40:28.549365 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.549329 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"] 
Apr 17 16:40:28.552456 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.552438 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:28.555192 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.555170 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 16:40:28.555298 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.555189 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8qclv\""
Apr 17 16:40:28.561171 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.561141 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"]
Apr 17 16:40:28.635724 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.635696 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7d04f860-b0e0-4352-8208-3854fac6c2b4-config-file\") pod \"limitador-limitador-64c8f475fb-hvd7s\" (UID: \"7d04f860-b0e0-4352-8208-3854fac6c2b4\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:28.635885 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.635743 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjvz7\" (UniqueName: \"kubernetes.io/projected/7d04f860-b0e0-4352-8208-3854fac6c2b4-kube-api-access-mjvz7\") pod \"limitador-limitador-64c8f475fb-hvd7s\" (UID: \"7d04f860-b0e0-4352-8208-3854fac6c2b4\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:28.645847 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.645815 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"]
Apr 17 16:40:28.736939 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.736903 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7d04f860-b0e0-4352-8208-3854fac6c2b4-config-file\") pod \"limitador-limitador-64c8f475fb-hvd7s\" (UID: \"7d04f860-b0e0-4352-8208-3854fac6c2b4\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:28.737168 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.736954 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjvz7\" (UniqueName: \"kubernetes.io/projected/7d04f860-b0e0-4352-8208-3854fac6c2b4-kube-api-access-mjvz7\") pod \"limitador-limitador-64c8f475fb-hvd7s\" (UID: \"7d04f860-b0e0-4352-8208-3854fac6c2b4\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:28.737650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.737624 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7d04f860-b0e0-4352-8208-3854fac6c2b4-config-file\") pod \"limitador-limitador-64c8f475fb-hvd7s\" (UID: \"7d04f860-b0e0-4352-8208-3854fac6c2b4\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:28.744850 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.744824 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjvz7\" (UniqueName: \"kubernetes.io/projected/7d04f860-b0e0-4352-8208-3854fac6c2b4-kube-api-access-mjvz7\") pod \"limitador-limitador-64c8f475fb-hvd7s\" (UID: \"7d04f860-b0e0-4352-8208-3854fac6c2b4\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:28.862236 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.862192 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:28.989992 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:28.989756 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"]
Apr 17 16:40:28.998035 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:40:28.997994 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d04f860_b0e0_4352_8208_3854fac6c2b4.slice/crio-48fb930566d1cf850defb3108e8e67b4971215ff0afba173735ce57644e9d2f2 WatchSource:0}: Error finding container 48fb930566d1cf850defb3108e8e67b4971215ff0afba173735ce57644e9d2f2: Status 404 returned error can't find the container with id 48fb930566d1cf850defb3108e8e67b4971215ff0afba173735ce57644e9d2f2
Apr 17 16:40:29.202055 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:29.201968 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s" event={"ID":"7d04f860-b0e0-4352-8208-3854fac6c2b4","Type":"ContainerStarted","Data":"48fb930566d1cf850defb3108e8e67b4971215ff0afba173735ce57644e9d2f2"}
Apr 17 16:40:33.217681 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:33.217641 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s" event={"ID":"7d04f860-b0e0-4352-8208-3854fac6c2b4","Type":"ContainerStarted","Data":"1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e"}
Apr 17 16:40:33.218053 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:33.217711 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:33.240661 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:33.240595 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s" podStartSLOduration=1.542132668 podStartE2EDuration="5.240580954s" podCreationTimestamp="2026-04-17 16:40:28 +0000 UTC" firstStartedPulling="2026-04-17 16:40:29.0003772 +0000 UTC m=+576.898138646" lastFinishedPulling="2026-04-17 16:40:32.698825485 +0000 UTC m=+580.596586932" observedRunningTime="2026-04-17 16:40:33.239678812 +0000 UTC m=+581.137440281" watchObservedRunningTime="2026-04-17 16:40:33.240580954 +0000 UTC m=+581.138342409"
Apr 17 16:40:41.951965 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:41.951924 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"]
Apr 17 16:40:41.952448 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:41.952203 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s" podUID="7d04f860-b0e0-4352-8208-3854fac6c2b4" containerName="limitador" containerID="cri-o://1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e" gracePeriod=30
Apr 17 16:40:41.952837 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:41.952806 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:42.897386 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:42.897359 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:43.052901 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.052782 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7d04f860-b0e0-4352-8208-3854fac6c2b4-config-file\") pod \"7d04f860-b0e0-4352-8208-3854fac6c2b4\" (UID: \"7d04f860-b0e0-4352-8208-3854fac6c2b4\") "
Apr 17 16:40:43.052901 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.052872 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjvz7\" (UniqueName: \"kubernetes.io/projected/7d04f860-b0e0-4352-8208-3854fac6c2b4-kube-api-access-mjvz7\") pod \"7d04f860-b0e0-4352-8208-3854fac6c2b4\" (UID: \"7d04f860-b0e0-4352-8208-3854fac6c2b4\") "
Apr 17 16:40:43.053379 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.053190 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d04f860-b0e0-4352-8208-3854fac6c2b4-config-file" (OuterVolumeSpecName: "config-file") pod "7d04f860-b0e0-4352-8208-3854fac6c2b4" (UID: "7d04f860-b0e0-4352-8208-3854fac6c2b4"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:40:43.055086 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.055047 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d04f860-b0e0-4352-8208-3854fac6c2b4-kube-api-access-mjvz7" (OuterVolumeSpecName: "kube-api-access-mjvz7") pod "7d04f860-b0e0-4352-8208-3854fac6c2b4" (UID: "7d04f860-b0e0-4352-8208-3854fac6c2b4"). InnerVolumeSpecName "kube-api-access-mjvz7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:40:43.154277 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.154223 2561 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7d04f860-b0e0-4352-8208-3854fac6c2b4-config-file\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:40:43.154277 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.154271 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjvz7\" (UniqueName: \"kubernetes.io/projected/7d04f860-b0e0-4352-8208-3854fac6c2b4-kube-api-access-mjvz7\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:40:43.252871 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.252836 2561 generic.go:358] "Generic (PLEG): container finished" podID="7d04f860-b0e0-4352-8208-3854fac6c2b4" containerID="1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e" exitCode=0
Apr 17 16:40:43.253059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.252887 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s" event={"ID":"7d04f860-b0e0-4352-8208-3854fac6c2b4","Type":"ContainerDied","Data":"1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e"}
Apr 17 16:40:43.253059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.252905 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"
Apr 17 16:40:43.253059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.252913 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-hvd7s" event={"ID":"7d04f860-b0e0-4352-8208-3854fac6c2b4","Type":"ContainerDied","Data":"48fb930566d1cf850defb3108e8e67b4971215ff0afba173735ce57644e9d2f2"}
Apr 17 16:40:43.253059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.252928 2561 scope.go:117] "RemoveContainer" containerID="1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e"
Apr 17 16:40:43.261320 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.261299 2561 scope.go:117] "RemoveContainer" containerID="1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e"
Apr 17 16:40:43.261614 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:40:43.261590 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e\": container with ID starting with 1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e not found: ID does not exist" containerID="1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e"
Apr 17 16:40:43.261679 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.261628 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e"} err="failed to get container status \"1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e\": rpc error: code = NotFound desc = could not find container \"1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e\": container with ID starting with 1a4ffbd9d7529e23604dbc90f517da601fac8e7c174dafa8948c9ddd1e769e5e not found: ID does not exist"
Apr 17 16:40:43.275219 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.275187 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"]
Apr 17 16:40:43.279817 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:43.279791 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-hvd7s"]
Apr 17 16:40:44.522279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:44.522236 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d04f860-b0e0-4352-8208-3854fac6c2b4" path="/var/lib/kubelet/pods/7d04f860-b0e0-4352-8208-3854fac6c2b4/volumes"
Apr 17 16:40:50.626862 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.626830 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-7xpg5"]
Apr 17 16:40:50.627262 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.627131 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d04f860-b0e0-4352-8208-3854fac6c2b4" containerName="limitador"
Apr 17 16:40:50.627262 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.627143 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d04f860-b0e0-4352-8208-3854fac6c2b4" containerName="limitador"
Apr 17 16:40:50.627262 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.627187 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d04f860-b0e0-4352-8208-3854fac6c2b4" containerName="limitador"
Apr 17 16:40:50.631193 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.631175 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-7xpg5"
Apr 17 16:40:50.638644 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.634134 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 17 16:40:50.638644 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.634474 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-cld7r\""
Apr 17 16:40:50.638644 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.638191 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-7xpg5"]
Apr 17 16:40:50.712642 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.712603 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwjgk\" (UniqueName: \"kubernetes.io/projected/f0abb617-8d1d-497b-a1e3-af010c9b5798-kube-api-access-kwjgk\") pod \"authorino-68bd676465-7xpg5\" (UID: \"f0abb617-8d1d-497b-a1e3-af010c9b5798\") " pod="kuadrant-system/authorino-68bd676465-7xpg5"
Apr 17 16:40:50.712810 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.712654 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f0abb617-8d1d-497b-a1e3-af010c9b5798-tls-cert\") pod \"authorino-68bd676465-7xpg5\" (UID: \"f0abb617-8d1d-497b-a1e3-af010c9b5798\") " pod="kuadrant-system/authorino-68bd676465-7xpg5"
Apr 17 16:40:50.813630 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.813590 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwjgk\" (UniqueName: \"kubernetes.io/projected/f0abb617-8d1d-497b-a1e3-af010c9b5798-kube-api-access-kwjgk\") pod \"authorino-68bd676465-7xpg5\" (UID: \"f0abb617-8d1d-497b-a1e3-af010c9b5798\") " pod="kuadrant-system/authorino-68bd676465-7xpg5"
Apr 17 16:40:50.813630 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.813635 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f0abb617-8d1d-497b-a1e3-af010c9b5798-tls-cert\") pod \"authorino-68bd676465-7xpg5\" (UID: \"f0abb617-8d1d-497b-a1e3-af010c9b5798\") " pod="kuadrant-system/authorino-68bd676465-7xpg5"
Apr 17 16:40:50.816008 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.815975 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f0abb617-8d1d-497b-a1e3-af010c9b5798-tls-cert\") pod \"authorino-68bd676465-7xpg5\" (UID: \"f0abb617-8d1d-497b-a1e3-af010c9b5798\") " pod="kuadrant-system/authorino-68bd676465-7xpg5"
Apr 17 16:40:50.821853 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.821830 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwjgk\" (UniqueName: \"kubernetes.io/projected/f0abb617-8d1d-497b-a1e3-af010c9b5798-kube-api-access-kwjgk\") pod \"authorino-68bd676465-7xpg5\" (UID: \"f0abb617-8d1d-497b-a1e3-af010c9b5798\") " pod="kuadrant-system/authorino-68bd676465-7xpg5"
Apr 17 16:40:50.945948 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:50.945867 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-7xpg5"
Apr 17 16:41:51.064686 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:51.064658 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-7xpg5"]
Apr 17 16:40:51.067375 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:40:51.067343 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0abb617_8d1d_497b_a1e3_af010c9b5798.slice/crio-eb3c37b1076617818d667f70fd0c0fa6e63f47af3411ef79d1b570fea87c4f73 WatchSource:0}: Error finding container eb3c37b1076617818d667f70fd0c0fa6e63f47af3411ef79d1b570fea87c4f73: Status 404 returned error can't find the container with id eb3c37b1076617818d667f70fd0c0fa6e63f47af3411ef79d1b570fea87c4f73
Apr 17 16:40:51.279613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:51.279528 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-7xpg5" event={"ID":"f0abb617-8d1d-497b-a1e3-af010c9b5798","Type":"ContainerStarted","Data":"eb3c37b1076617818d667f70fd0c0fa6e63f47af3411ef79d1b570fea87c4f73"}
Apr 17 16:40:54.293727 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:54.293689 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-7xpg5" event={"ID":"f0abb617-8d1d-497b-a1e3-af010c9b5798","Type":"ContainerStarted","Data":"0fd8c25ab5f7642d912c9f6e495320a10ccb4430a6b107ffced2670e58adf58f"}
Apr 17 16:40:54.309755 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:40:54.309695 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-7xpg5" podStartSLOduration=1.849294422 podStartE2EDuration="4.309677722s" podCreationTimestamp="2026-04-17 16:40:50 +0000 UTC" firstStartedPulling="2026-04-17 16:40:51.068613042 +0000 UTC m=+598.966374474" lastFinishedPulling="2026-04-17 16:40:53.528996326 +0000 UTC m=+601.426757774" observedRunningTime="2026-04-17 16:40:54.308444583 +0000 UTC m=+602.206206041" watchObservedRunningTime="2026-04-17 16:40:54.309677722 +0000 UTC m=+602.207439178"
Apr 17 16:41:00.841306 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.841220 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"]
Apr 17 16:41:00.844573 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.844554 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.855751 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.855724 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"]
Apr 17 16:41:00.889637 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.889590 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.889814 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.889645 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vghx9\" (UniqueName: \"kubernetes.io/projected/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-kube-api-access-vghx9\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.889814 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.889725 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.889814 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.889777 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.889928 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.889833 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.889928 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.889871 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.889928 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.889898 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.991214 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.991184 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.991214 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.991217 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.991459 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.991247 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.991459 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.991270 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vghx9\" (UniqueName: \"kubernetes.io/projected/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-kube-api-access-vghx9\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.991459 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.991303 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.991459 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.991320 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.991459 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.991348 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.992147 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.992116 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.993678 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.993652 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.993774 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.993723 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.993774 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.993751 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:00.993843 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:00.993780 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:01.000510 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:01.000485 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:01.001222 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:01.001202 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vghx9\" (UniqueName: \"kubernetes.io/projected/9df89ec2-ebe9-4433-b760-9c3d2994fe9c-kube-api-access-vghx9\") pod \"istiod-openshift-gateway-55ff986f96-k5849\" (UID: \"9df89ec2-ebe9-4433-b760-9c3d2994fe9c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:01.154215 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:01.154121 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:01.328279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:01.328242 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"]
Apr 17 16:41:01.331255 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:41:01.331226 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9df89ec2_ebe9_4433_b760_9c3d2994fe9c.slice/crio-551df69135e12b6ee056641209918f917743e9196c25fae71bb77066d6f518c9 WatchSource:0}: Error finding container 551df69135e12b6ee056641209918f917743e9196c25fae71bb77066d6f518c9: Status 404 returned error can't find the container with id 551df69135e12b6ee056641209918f917743e9196c25fae71bb77066d6f518c9
Apr 17 16:41:01.333456 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:01.333420 2561 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:41:01.333527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:01.333496 2561 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 16:41:02.320567 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.320525 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849" event={"ID":"9df89ec2-ebe9-4433-b760-9c3d2994fe9c","Type":"ContainerStarted","Data":"711caf41e93f9cd4a514511d7a1d5024660e7e27ed1e147fa927cd0f951463c4"}
Apr 17 16:41:02.320567 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.320572 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849" event={"ID":"9df89ec2-ebe9-4433-b760-9c3d2994fe9c","Type":"ContainerStarted","Data":"551df69135e12b6ee056641209918f917743e9196c25fae71bb77066d6f518c9"}
Apr 17 16:41:02.321152 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.320672 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:02.322334 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.322312 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849"
Apr 17 16:41:02.349889 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.349836 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-k5849" podStartSLOduration=2.3498212880000002 podStartE2EDuration="2.349821288s" podCreationTimestamp="2026-04-17 16:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:41:02.346189949 +0000 UTC m=+610.243951404" watchObservedRunningTime="2026-04-17 16:41:02.349821288 +0000 UTC m=+610.247582742"
Apr 17 16:41:02.535695 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.535653 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc"]
Apr 17 16:41:02.535958 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.535933 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" podUID="1001307d-8d60-4e38-9e36-6e5f54d26cc4" containerName="discovery" containerID="cri-o://9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a" gracePeriod=30
Apr 17 16:41:02.796880 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.796854 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc"
Apr 17 16:41:02.909943 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.909845 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1001307d-8d60-4e38-9e36-6e5f54d26cc4-local-certs\") pod \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") "
Apr 17 16:41:02.909943 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.909915 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-dns-cert\") pod \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") "
Apr 17 16:41:02.910187 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.909948 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-ca-configmap\") pod \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") "
Apr 17 16:41:02.910187 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.910018 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n52nv\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-kube-api-access-n52nv\") pod \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") "
Apr 17 16:41:02.910187 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.910040 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-kubeconfig\") pod \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") "
Apr 17 16:41:02.910187 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.910063 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-token\") pod \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") "
Apr 17 16:41:02.910187 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.910102 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-cacerts\") pod \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\" (UID: \"1001307d-8d60-4e38-9e36-6e5f54d26cc4\") "
Apr 17 16:41:02.910715 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.910675 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "1001307d-8d60-4e38-9e36-6e5f54d26cc4" (UID: "1001307d-8d60-4e38-9e36-6e5f54d26cc4"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 16:41:02.912646 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.912607 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-kube-api-access-n52nv" (OuterVolumeSpecName: "kube-api-access-n52nv") pod "1001307d-8d60-4e38-9e36-6e5f54d26cc4" (UID: "1001307d-8d60-4e38-9e36-6e5f54d26cc4"). InnerVolumeSpecName "kube-api-access-n52nv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:41:02.912773 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.912646 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-token" (OuterVolumeSpecName: "istio-token") pod "1001307d-8d60-4e38-9e36-6e5f54d26cc4" (UID: "1001307d-8d60-4e38-9e36-6e5f54d26cc4"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:41:02.912773 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.912744 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1001307d-8d60-4e38-9e36-6e5f54d26cc4-local-certs" (OuterVolumeSpecName: "local-certs") pod "1001307d-8d60-4e38-9e36-6e5f54d26cc4" (UID: "1001307d-8d60-4e38-9e36-6e5f54d26cc4"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:41:02.912860 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.912826 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "1001307d-8d60-4e38-9e36-6e5f54d26cc4" (UID: "1001307d-8d60-4e38-9e36-6e5f54d26cc4"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:41:02.912941 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.912916 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "1001307d-8d60-4e38-9e36-6e5f54d26cc4" (UID: "1001307d-8d60-4e38-9e36-6e5f54d26cc4"). InnerVolumeSpecName "istio-csr-dns-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:41:02.912941 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:02.912932 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-cacerts" (OuterVolumeSpecName: "cacerts") pod "1001307d-8d60-4e38-9e36-6e5f54d26cc4" (UID: "1001307d-8d60-4e38-9e36-6e5f54d26cc4"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:41:03.011640 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.011604 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n52nv\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-kube-api-access-n52nv\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:41:03.011640 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.011633 2561 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-kubeconfig\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:41:03.011640 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.011643 2561 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-token\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:41:03.011640 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.011651 2561 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-cacerts\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:41:03.011900 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.011660 2561 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/1001307d-8d60-4e38-9e36-6e5f54d26cc4-local-certs\") on node 
\"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:41:03.011900 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.011669 2561 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-dns-cert\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:41:03.011900 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.011678 2561 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/1001307d-8d60-4e38-9e36-6e5f54d26cc4-istio-csr-ca-configmap\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:41:03.324756 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.324716 2561 generic.go:358] "Generic (PLEG): container finished" podID="1001307d-8d60-4e38-9e36-6e5f54d26cc4" containerID="9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a" exitCode=0 Apr 17 16:41:03.325236 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.324780 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" Apr 17 16:41:03.325236 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.324800 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" event={"ID":"1001307d-8d60-4e38-9e36-6e5f54d26cc4","Type":"ContainerDied","Data":"9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a"} Apr 17 16:41:03.325236 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.324839 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc" event={"ID":"1001307d-8d60-4e38-9e36-6e5f54d26cc4","Type":"ContainerDied","Data":"6d41f52efea34b85e375a6fa51a4c185460812f91f8458f9705a9eb6670dd0e5"} Apr 17 16:41:03.325236 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.324857 2561 scope.go:117] "RemoveContainer" containerID="9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a" Apr 17 16:41:03.333605 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.333587 2561 scope.go:117] "RemoveContainer" containerID="9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a" Apr 17 16:41:03.333871 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:41:03.333851 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a\": container with ID starting with 9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a not found: ID does not exist" containerID="9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a" Apr 17 16:41:03.333920 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.333880 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a"} err="failed to get container status 
\"9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a\": rpc error: code = NotFound desc = could not find container \"9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a\": container with ID starting with 9578ab72edbdf8bfac65497163af56d1c5bb26af92ddd5e054131ecae9f4745a not found: ID does not exist" Apr 17 16:41:03.359537 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.359494 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc"] Apr 17 16:41:03.367935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:03.367903 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-dv6wc"] Apr 17 16:41:04.522728 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:04.522693 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1001307d-8d60-4e38-9e36-6e5f54d26cc4" path="/var/lib/kubelet/pods/1001307d-8d60-4e38-9e36-6e5f54d26cc4/volumes" Apr 17 16:41:10.394983 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.394949 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc"] Apr 17 16:41:10.395442 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.395238 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1001307d-8d60-4e38-9e36-6e5f54d26cc4" containerName="discovery" Apr 17 16:41:10.395442 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.395248 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="1001307d-8d60-4e38-9e36-6e5f54d26cc4" containerName="discovery" Apr 17 16:41:10.395442 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.395302 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="1001307d-8d60-4e38-9e36-6e5f54d26cc4" containerName="discovery" Apr 17 16:41:10.411877 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.411850 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc"] Apr 17 16:41:10.412054 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.411959 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:10.414390 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.414337 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 16:41:10.414526 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.414408 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 16:41:10.415180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.415159 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 16:41:10.415315 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.415191 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-5kc6x\"" Apr 17 16:41:10.465859 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.465825 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert\") pod \"llmisvc-controller-manager-868cc5c7bb-m6bfc\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:10.466020 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.465898 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtch\" (UniqueName: \"kubernetes.io/projected/3753367d-0233-4668-8ad8-67cf282aa3c3-kube-api-access-zrtch\") pod \"llmisvc-controller-manager-868cc5c7bb-m6bfc\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 
16:41:10.566670 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.566635 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert\") pod \"llmisvc-controller-manager-868cc5c7bb-m6bfc\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:10.566853 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.566715 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtch\" (UniqueName: \"kubernetes.io/projected/3753367d-0233-4668-8ad8-67cf282aa3c3-kube-api-access-zrtch\") pod \"llmisvc-controller-manager-868cc5c7bb-m6bfc\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:10.566853 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:41:10.566829 2561 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 17 16:41:10.566973 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:41:10.566939 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert podName:3753367d-0233-4668-8ad8-67cf282aa3c3 nodeName:}" failed. No retries permitted until 2026-04-17 16:41:11.066914364 +0000 UTC m=+618.964675819 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert") pod "llmisvc-controller-manager-868cc5c7bb-m6bfc" (UID: "3753367d-0233-4668-8ad8-67cf282aa3c3") : secret "llmisvc-webhook-server-cert" not found Apr 17 16:41:10.575879 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:10.575852 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtch\" (UniqueName: \"kubernetes.io/projected/3753367d-0233-4668-8ad8-67cf282aa3c3-kube-api-access-zrtch\") pod \"llmisvc-controller-manager-868cc5c7bb-m6bfc\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:11.070354 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:11.070305 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert\") pod \"llmisvc-controller-manager-868cc5c7bb-m6bfc\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:11.072669 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:11.072638 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert\") pod \"llmisvc-controller-manager-868cc5c7bb-m6bfc\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:11.323646 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:11.323564 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:11.441491 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:11.441392 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc"] Apr 17 16:41:11.444317 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:41:11.444285 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3753367d_0233_4668_8ad8_67cf282aa3c3.slice/crio-f76e74c1f5722ce582dc08819967802daa767e410ea30164abdb8dd5b92942d3 WatchSource:0}: Error finding container f76e74c1f5722ce582dc08819967802daa767e410ea30164abdb8dd5b92942d3: Status 404 returned error can't find the container with id f76e74c1f5722ce582dc08819967802daa767e410ea30164abdb8dd5b92942d3 Apr 17 16:41:12.353628 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:12.353591 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" event={"ID":"3753367d-0233-4668-8ad8-67cf282aa3c3","Type":"ContainerStarted","Data":"f76e74c1f5722ce582dc08819967802daa767e410ea30164abdb8dd5b92942d3"} Apr 17 16:41:15.365494 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:15.365456 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" event={"ID":"3753367d-0233-4668-8ad8-67cf282aa3c3","Type":"ContainerStarted","Data":"06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340"} Apr 17 16:41:15.365900 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:15.365600 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:41:15.383897 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:15.383836 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" podStartSLOduration=2.303494985 podStartE2EDuration="5.38381952s" 
podCreationTimestamp="2026-04-17 16:41:10 +0000 UTC" firstStartedPulling="2026-04-17 16:41:11.446479039 +0000 UTC m=+619.344240480" lastFinishedPulling="2026-04-17 16:41:14.526803582 +0000 UTC m=+622.424565015" observedRunningTime="2026-04-17 16:41:15.380169729 +0000 UTC m=+623.277931321" watchObservedRunningTime="2026-04-17 16:41:15.38381952 +0000 UTC m=+623.281580977" Apr 17 16:41:46.370806 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:41:46.370771 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:42:21.039594 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.039558 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-pcbrh"] Apr 17 16:42:21.042699 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.042680 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:21.044993 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.044973 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 16:42:21.045135 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.045033 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-lp475\"" Apr 17 16:42:21.053564 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.053542 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-pcbrh"] Apr 17 16:42:21.117654 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.117625 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49s6b\" (UniqueName: \"kubernetes.io/projected/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-kube-api-access-49s6b\") pod \"model-serving-api-86f7b4b499-pcbrh\" (UID: \"6b89a88d-91b8-4e59-adb8-3d132ceef8a1\") " 
pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:21.117808 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.117658 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-tls-certs\") pod \"model-serving-api-86f7b4b499-pcbrh\" (UID: \"6b89a88d-91b8-4e59-adb8-3d132ceef8a1\") " pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:21.218989 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.218952 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49s6b\" (UniqueName: \"kubernetes.io/projected/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-kube-api-access-49s6b\") pod \"model-serving-api-86f7b4b499-pcbrh\" (UID: \"6b89a88d-91b8-4e59-adb8-3d132ceef8a1\") " pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:21.218989 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.218991 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-tls-certs\") pod \"model-serving-api-86f7b4b499-pcbrh\" (UID: \"6b89a88d-91b8-4e59-adb8-3d132ceef8a1\") " pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:21.219218 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:42:21.219111 2561 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 17 16:42:21.219218 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:42:21.219190 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-tls-certs podName:6b89a88d-91b8-4e59-adb8-3d132ceef8a1 nodeName:}" failed. No retries permitted until 2026-04-17 16:42:21.719170314 +0000 UTC m=+689.616931750 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-tls-certs") pod "model-serving-api-86f7b4b499-pcbrh" (UID: "6b89a88d-91b8-4e59-adb8-3d132ceef8a1") : secret "model-serving-api-tls" not found Apr 17 16:42:21.231089 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.231054 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49s6b\" (UniqueName: \"kubernetes.io/projected/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-kube-api-access-49s6b\") pod \"model-serving-api-86f7b4b499-pcbrh\" (UID: \"6b89a88d-91b8-4e59-adb8-3d132ceef8a1\") " pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:21.724264 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.724223 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-tls-certs\") pod \"model-serving-api-86f7b4b499-pcbrh\" (UID: \"6b89a88d-91b8-4e59-adb8-3d132ceef8a1\") " pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:21.726599 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.726569 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b89a88d-91b8-4e59-adb8-3d132ceef8a1-tls-certs\") pod \"model-serving-api-86f7b4b499-pcbrh\" (UID: \"6b89a88d-91b8-4e59-adb8-3d132ceef8a1\") " pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:21.953776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:21.953737 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:22.073760 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:22.073734 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-pcbrh"] Apr 17 16:42:22.076198 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:42:22.076162 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b89a88d_91b8_4e59_adb8_3d132ceef8a1.slice/crio-7d6e24ea6fbb9a5ba5eeb4a775a9ce959c281ec0505889ead12ebde0030358da WatchSource:0}: Error finding container 7d6e24ea6fbb9a5ba5eeb4a775a9ce959c281ec0505889ead12ebde0030358da: Status 404 returned error can't find the container with id 7d6e24ea6fbb9a5ba5eeb4a775a9ce959c281ec0505889ead12ebde0030358da Apr 17 16:42:22.077993 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:22.077978 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:42:22.579281 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:22.579244 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-pcbrh" event={"ID":"6b89a88d-91b8-4e59-adb8-3d132ceef8a1","Type":"ContainerStarted","Data":"7d6e24ea6fbb9a5ba5eeb4a775a9ce959c281ec0505889ead12ebde0030358da"} Apr 17 16:42:24.587876 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:24.587837 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-pcbrh" event={"ID":"6b89a88d-91b8-4e59-adb8-3d132ceef8a1","Type":"ContainerStarted","Data":"6b5acc025f6ec5c15b1e55297823a8daee70b64e3b888354c97dd4addab653aa"} Apr 17 16:42:24.588283 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:24.587954 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:42:24.604132 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:24.604063 2561 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-pcbrh" podStartSLOduration=1.385373342 podStartE2EDuration="3.604048865s" podCreationTimestamp="2026-04-17 16:42:21 +0000 UTC" firstStartedPulling="2026-04-17 16:42:22.078132676 +0000 UTC m=+689.975894109" lastFinishedPulling="2026-04-17 16:42:24.296808199 +0000 UTC m=+692.194569632" observedRunningTime="2026-04-17 16:42:24.603049589 +0000 UTC m=+692.500811044" watchObservedRunningTime="2026-04-17 16:42:24.604048865 +0000 UTC m=+692.501810339" Apr 17 16:42:35.597408 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:42:35.597324 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-pcbrh" Apr 17 16:43:22.157333 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.157294 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"] Apr 17 16:43:22.160682 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.160664 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" Apr 17 16:43:22.164104 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.164057 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:43:22.164671 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.164290 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-ktlcc\"" Apr 17 16:43:22.164671 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.164223 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\"" Apr 17 16:43:22.164671 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.164180 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:43:22.164671 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.164242 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 17 16:43:22.172577 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.172544 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"] Apr 17 16:43:22.205899 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.205863 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-659j6\" (UniqueName: \"kubernetes.io/projected/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kube-api-access-659j6\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" Apr 17 16:43:22.206094 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.205957 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.206094 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.206008 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.206229 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.206067 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.206229 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.206136 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.206229 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.206180 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307260 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307213 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307260 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307265 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307309 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307337 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307376 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307425 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-659j6\" (UniqueName: \"kubernetes.io/projected/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kube-api-access-659j6\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307713 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307685 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307776 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307699 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.307872 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307847 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.308005 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.307983 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.310036 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.310008 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.315856 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.315835 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-659j6\" (UniqueName: \"kubernetes.io/projected/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kube-api-access-659j6\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.472208 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.472124 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:22.601255 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.601215 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"]
Apr 17 16:43:22.603751 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:43:22.603727 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d5b5a86_e2bb_4f62_8559_f73a068d3cc3.slice/crio-8ed97158748effdc4d2ebfabe19a68c1ad26f29b3d69b8675034e70e6e841df7 WatchSource:0}: Error finding container 8ed97158748effdc4d2ebfabe19a68c1ad26f29b3d69b8675034e70e6e841df7: Status 404 returned error can't find the container with id 8ed97158748effdc4d2ebfabe19a68c1ad26f29b3d69b8675034e70e6e841df7
Apr 17 16:43:22.778926 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:22.778839 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" event={"ID":"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3","Type":"ContainerStarted","Data":"8ed97158748effdc4d2ebfabe19a68c1ad26f29b3d69b8675034e70e6e841df7"}
Apr 17 16:43:25.794713 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:25.794631 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" event={"ID":"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3","Type":"ContainerStarted","Data":"d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8"}
Apr 17 16:43:26.800711 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:26.800674 2561 generic.go:358] "Generic (PLEG): container finished" podID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerID="d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8" exitCode=0
Apr 17 16:43:26.801157 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:26.800731 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" event={"ID":"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3","Type":"ContainerDied","Data":"d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8"}
Apr 17 16:43:28.809701 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:28.809656 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" event={"ID":"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3","Type":"ContainerStarted","Data":"b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679"}
Apr 17 16:43:58.920589 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:58.920555 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" event={"ID":"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3","Type":"ContainerStarted","Data":"a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7"}
Apr 17 16:43:58.921048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:58.920797 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:58.923492 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:58.923470 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:43:58.942413 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:43:58.942357 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" podStartSLOduration=1.5520018740000001 podStartE2EDuration="36.942340618s" podCreationTimestamp="2026-04-17 16:43:22 +0000 UTC" firstStartedPulling="2026-04-17 16:43:22.605877126 +0000 UTC m=+750.503638564" lastFinishedPulling="2026-04-17 16:43:57.996215872 +0000 UTC m=+785.893977308" observedRunningTime="2026-04-17 16:43:58.940242421 +0000 UTC m=+786.838003876" watchObservedRunningTime="2026-04-17 16:43:58.942340618 +0000 UTC m=+786.840102074"
Apr 17 16:44:02.473143 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:02.473018 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:44:02.473143 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:02.473109 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:44:12.475116 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:12.475058 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:44:12.476294 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:12.476273 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:44:13.706039 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:13.706003 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"]
Apr 17 16:44:13.967443 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:13.967356 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="tokenizer" containerID="cri-o://a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7" gracePeriod=30
Apr 17 16:44:13.967606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:13.967345 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="main" containerID="cri-o://b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679" gracePeriod=30
Apr 17 16:44:14.972257 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:14.972224 2561 generic.go:358] "Generic (PLEG): container finished" podID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerID="b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679" exitCode=0
Apr 17 16:44:14.972618 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:14.972295 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" event={"ID":"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3","Type":"ContainerDied","Data":"b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679"}
Apr 17 16:44:15.307393 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.307368 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:44:15.374132 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374088 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-tmp\") pod \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") "
Apr 17 16:44:15.374334 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374145 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-659j6\" (UniqueName: \"kubernetes.io/projected/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kube-api-access-659j6\") pod \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") "
Apr 17 16:44:15.374334 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374169 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kserve-provision-location\") pod \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") "
Apr 17 16:44:15.374334 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374204 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tls-certs\") pod \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") "
Apr 17 16:44:15.374334 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374221 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-uds\") pod \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") "
Apr 17 16:44:15.374334 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374249 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-cache\") pod \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\" (UID: \"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3\") "
Apr 17 16:44:15.374593 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374490 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" (UID: "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:44:15.374593 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374552 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" (UID: "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:44:15.374671 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374632 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" (UID: "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:44:15.374937 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.374911 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" (UID: "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:44:15.376530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.376506 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kube-api-access-659j6" (OuterVolumeSpecName: "kube-api-access-659j6") pod "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" (UID: "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3"). InnerVolumeSpecName "kube-api-access-659j6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:44:15.376603 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.376536 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" (UID: "2d5b5a86-e2bb-4f62-8559-f73a068d3cc3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:44:15.475581 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.475541 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:44:15.475581 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.475576 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-659j6\" (UniqueName: \"kubernetes.io/projected/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kube-api-access-659j6\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:44:15.475581 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.475586 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:44:15.475799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.475597 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:44:15.475799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.475607 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:44:15.475799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.475617 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:44:15.977365 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.977326 2561 generic.go:358] "Generic (PLEG): container finished" podID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerID="a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7" exitCode=0
Apr 17 16:44:15.977839 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.977409 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" event={"ID":"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3","Type":"ContainerDied","Data":"a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7"}
Apr 17 16:44:15.977839 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.977418 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"
Apr 17 16:44:15.977839 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.977431 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx" event={"ID":"2d5b5a86-e2bb-4f62-8559-f73a068d3cc3","Type":"ContainerDied","Data":"8ed97158748effdc4d2ebfabe19a68c1ad26f29b3d69b8675034e70e6e841df7"}
Apr 17 16:44:15.977839 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.977447 2561 scope.go:117] "RemoveContainer" containerID="a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7"
Apr 17 16:44:15.986291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.986260 2561 scope.go:117] "RemoveContainer" containerID="b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679"
Apr 17 16:44:15.993780 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.993763 2561 scope.go:117] "RemoveContainer" containerID="d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8"
Apr 17 16:44:15.998605 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:15.998571 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"]
Apr 17 16:44:16.001969 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:16.001948 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-74c54wswzx"]
Apr 17 16:44:16.002032 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:16.001981 2561 scope.go:117] "RemoveContainer" containerID="a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7"
Apr 17 16:44:16.002434 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:44:16.002414 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7\": container with ID starting with a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7 not found: ID does not exist" containerID="a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7"
Apr 17 16:44:16.002518 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:16.002450 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7"} err="failed to get container status \"a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7\": rpc error: code = NotFound desc = could not find container \"a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7\": container with ID starting with a9e14bf184ef3a63e327fafc6637406e852709c6c6db6be3eef52d9604a3a8b7 not found: ID does not exist"
Apr 17 16:44:16.002518 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:16.002479 2561 scope.go:117] "RemoveContainer" containerID="b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679"
Apr 17 16:44:16.002789 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:44:16.002771 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679\": container with ID starting with b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679 not found: ID does not exist" containerID="b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679"
Apr 17 16:44:16.002827 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:16.002795 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679"} err="failed to get container status \"b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679\": rpc error: code = NotFound desc = could not find container \"b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679\": container with ID starting with b08563b783bba99b31473fddf35c179a39d26edaf88aca43d82453083a423679 not found: ID does not exist"
Apr 17 16:44:16.002827 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:16.002821 2561 scope.go:117] "RemoveContainer" containerID="d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8"
Apr 17 16:44:16.003052 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:44:16.003030 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8\": container with ID starting with d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8 not found: ID does not exist" containerID="d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8"
Apr 17 16:44:16.003122 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:16.003058 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8"} err="failed to get container status \"d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8\": rpc error: code = NotFound desc = could not find container \"d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8\": container with ID starting with d372e02656985ab08c55d26b054baa93906f13e5b25af9bd5a9445be00a25bf8 not found: ID does not exist"
Apr 17 16:44:16.524049 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:16.524016 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" path="/var/lib/kubelet/pods/2d5b5a86-e2bb-4f62-8559-f73a068d3cc3/volumes"
Apr 17 16:44:29.206370 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206330 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"]
Apr 17 16:44:29.206763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206632 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="main"
Apr 17 16:44:29.206763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206642 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="main"
Apr 17 16:44:29.206763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206661 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="storage-initializer"
Apr 17 16:44:29.206763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206667 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="storage-initializer"
Apr 17 16:44:29.206763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206679 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="tokenizer"
Apr 17 16:44:29.206763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206685 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="tokenizer"
Apr 17 16:44:29.206763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206730 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="tokenizer"
Apr 17 16:44:29.206763 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.206740 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d5b5a86-e2bb-4f62-8559-f73a068d3cc3" containerName="main"
Apr 17 16:44:29.361240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.361149 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"]
Apr 17 16:44:29.361382 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.361332 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.364929 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.364902 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\""
Apr 17 16:44:29.364929 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.364902 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 16:44:29.365159 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.364903 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-fqv9r\""
Apr 17 16:44:29.365159 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.364903 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 17 16:44:29.365159 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.364915 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 16:44:29.492802 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.492765 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.492802 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.492803 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.493021 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.492833 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.493021 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.492912 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.493021 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.492952 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.493021 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.493004 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqh4\" (UniqueName: \"kubernetes.io/projected/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kube-api-access-6mqh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594309 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594348 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594598 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594372 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594598 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594402 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqh4\" (UniqueName: \"kubernetes.io/projected/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kube-api-access-6mqh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594598 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594448 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594598 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594465 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594777 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594841 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594800 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594876 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594835 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"
Apr 17 16:44:29.594910 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.594884 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") "
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:29.596985 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.596961 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:29.602825 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.602804 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqh4\" (UniqueName: \"kubernetes.io/projected/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kube-api-access-6mqh4\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:29.675898 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.675810 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:29.805208 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:29.805017 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"] Apr 17 16:44:29.808235 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:44:29.808206 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6deedf63_5e4e_44d1_aab9_1ace976ecfc5.slice/crio-2f40d6056a9dada8a3ac415f449debcaf9aca91397d567425caed04b71cc0eb7 WatchSource:0}: Error finding container 2f40d6056a9dada8a3ac415f449debcaf9aca91397d567425caed04b71cc0eb7: Status 404 returned error can't find the container with id 2f40d6056a9dada8a3ac415f449debcaf9aca91397d567425caed04b71cc0eb7 Apr 17 16:44:30.024795 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:30.024712 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" event={"ID":"6deedf63-5e4e-44d1-aab9-1ace976ecfc5","Type":"ContainerStarted","Data":"2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0"} Apr 17 16:44:30.024795 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:30.024750 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" event={"ID":"6deedf63-5e4e-44d1-aab9-1ace976ecfc5","Type":"ContainerStarted","Data":"2f40d6056a9dada8a3ac415f449debcaf9aca91397d567425caed04b71cc0eb7"} Apr 17 16:44:31.029009 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:31.028974 2561 generic.go:358] "Generic (PLEG): container finished" podID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerID="2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0" exitCode=0 Apr 17 16:44:31.029402 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:44:31.029082 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" event={"ID":"6deedf63-5e4e-44d1-aab9-1ace976ecfc5","Type":"ContainerDied","Data":"2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0"} Apr 17 16:44:32.035144 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:32.035109 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" event={"ID":"6deedf63-5e4e-44d1-aab9-1ace976ecfc5","Type":"ContainerStarted","Data":"19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66"} Apr 17 16:44:32.035144 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:32.035145 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" event={"ID":"6deedf63-5e4e-44d1-aab9-1ace976ecfc5","Type":"ContainerStarted","Data":"ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe"} Apr 17 16:44:32.035575 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:32.035238 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:32.055926 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:32.055875 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" podStartSLOduration=3.055861655 podStartE2EDuration="3.055861655s" podCreationTimestamp="2026-04-17 16:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:44:32.054466076 +0000 UTC m=+819.952227532" watchObservedRunningTime="2026-04-17 16:44:32.055861655 +0000 UTC m=+819.953623145" Apr 17 16:44:39.676866 ip-10-0-130-35 kubenswrapper[2561]: 
I0417 16:44:39.676821 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:39.677439 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:39.676969 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:39.679600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:39.679577 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:40.062562 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:40.062533 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:44:47.976934 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:47.976889 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb"] Apr 17 16:44:47.979486 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:47.979463 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:47.981901 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:47.981874 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-ppmhl\"" Apr 17 16:44:47.982663 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:47.982631 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 17 16:44:47.993193 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:47.993163 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb"] Apr 17 16:44:48.152037 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.152000 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.152213 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.152051 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p7sr\" (UniqueName: \"kubernetes.io/projected/745f742b-e524-426a-b157-8f644975b283-kube-api-access-6p7sr\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.152213 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.152101 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.152213 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.152187 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.152332 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.152242 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/745f742b-e524-426a-b157-8f644975b283-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.152332 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.152289 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.253603 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.253514 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.253603 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.253571 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6p7sr\" (UniqueName: \"kubernetes.io/projected/745f742b-e524-426a-b157-8f644975b283-kube-api-access-6p7sr\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.253827 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.253610 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.253827 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.253744 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.253827 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.253801 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/745f742b-e524-426a-b157-8f644975b283-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.253986 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.253845 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.253986 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.253962 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.254110 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.254023 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.254172 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.254144 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.254325 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.254304 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.256613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.256574 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/745f742b-e524-426a-b157-8f644975b283-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.267204 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.267174 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p7sr\" (UniqueName: \"kubernetes.io/projected/745f742b-e524-426a-b157-8f644975b283-kube-api-access-6p7sr\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.290255 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.290204 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:48.420787 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:48.420749 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb"] Apr 17 16:44:48.424205 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:44:48.424171 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod745f742b_e524_426a_b157_8f644975b283.slice/crio-79f191f0e6fc2adf557365fd2f0cddc1f8031d411d75ed7b40f007e736c02d3d WatchSource:0}: Error finding container 79f191f0e6fc2adf557365fd2f0cddc1f8031d411d75ed7b40f007e736c02d3d: Status 404 returned error can't find the container with id 79f191f0e6fc2adf557365fd2f0cddc1f8031d411d75ed7b40f007e736c02d3d Apr 17 16:44:49.090463 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:49.090423 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" event={"ID":"745f742b-e524-426a-b157-8f644975b283","Type":"ContainerStarted","Data":"6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d"} Apr 17 16:44:49.090463 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:49.090465 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" event={"ID":"745f742b-e524-426a-b157-8f644975b283","Type":"ContainerStarted","Data":"79f191f0e6fc2adf557365fd2f0cddc1f8031d411d75ed7b40f007e736c02d3d"} Apr 17 16:44:50.095710 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:50.095672 2561 generic.go:358] "Generic (PLEG): container finished" podID="745f742b-e524-426a-b157-8f644975b283" containerID="6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d" exitCode=0 Apr 17 16:44:50.096094 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:44:50.095721 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" event={"ID":"745f742b-e524-426a-b157-8f644975b283","Type":"ContainerDied","Data":"6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d"} Apr 17 16:44:51.101706 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:51.101676 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" event={"ID":"745f742b-e524-426a-b157-8f644975b283","Type":"ContainerStarted","Data":"77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f"} Apr 17 16:44:51.101706 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:51.101710 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" event={"ID":"745f742b-e524-426a-b157-8f644975b283","Type":"ContainerStarted","Data":"8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8"} Apr 17 16:44:51.102130 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:51.101851 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:51.121799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:51.121752 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" podStartSLOduration=4.12173795 podStartE2EDuration="4.12173795s" podCreationTimestamp="2026-04-17 16:44:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:44:51.119752032 +0000 UTC m=+839.017513487" watchObservedRunningTime="2026-04-17 16:44:51.12173795 +0000 UTC m=+839.019499405" Apr 17 16:44:58.290980 ip-10-0-130-35 kubenswrapper[2561]: 
I0417 16:44:58.290939 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:58.290980 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:58.290986 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:58.293649 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:58.293625 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:44:59.137952 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:44:59.137917 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:45:02.068522 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:02.068489 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:45:03.270389 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:03.270354 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"] Apr 17 16:45:03.270785 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:03.270710 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="tokenizer" containerID="cri-o://19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66" gracePeriod=30 Apr 17 16:45:03.270868 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:03.270837 2561 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="main" containerID="cri-o://ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe" gracePeriod=30 Apr 17 16:45:04.157577 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.157543 2561 generic.go:358] "Generic (PLEG): container finished" podID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerID="ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe" exitCode=0 Apr 17 16:45:04.157846 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.157606 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" event={"ID":"6deedf63-5e4e-44d1-aab9-1ace976ecfc5","Type":"ContainerDied","Data":"ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe"} Apr 17 16:45:04.610807 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.610779 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:45:04.695007 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.694908 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-tmp\") pod \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " Apr 17 16:45:04.695007 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.694947 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-cache\") pod \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " Apr 17 16:45:04.695273 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695104 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kserve-provision-location\") pod \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " Apr 17 16:45:04.695273 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695139 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-uds\") pod \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " Apr 17 16:45:04.695273 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695171 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tls-certs\") pod \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " Apr 17 
16:45:04.695273 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695191 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqh4\" (UniqueName: \"kubernetes.io/projected/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kube-api-access-6mqh4\") pod \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\" (UID: \"6deedf63-5e4e-44d1-aab9-1ace976ecfc5\") " Apr 17 16:45:04.695273 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695248 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6deedf63-5e4e-44d1-aab9-1ace976ecfc5" (UID: "6deedf63-5e4e-44d1-aab9-1ace976ecfc5"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:04.695502 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695409 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6deedf63-5e4e-44d1-aab9-1ace976ecfc5" (UID: "6deedf63-5e4e-44d1-aab9-1ace976ecfc5"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:04.695502 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695419 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6deedf63-5e4e-44d1-aab9-1ace976ecfc5" (UID: "6deedf63-5e4e-44d1-aab9-1ace976ecfc5"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:04.695502 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695438 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:45:04.695829 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.695803 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6deedf63-5e4e-44d1-aab9-1ace976ecfc5" (UID: "6deedf63-5e4e-44d1-aab9-1ace976ecfc5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:45:04.697350 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.697330 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6deedf63-5e4e-44d1-aab9-1ace976ecfc5" (UID: "6deedf63-5e4e-44d1-aab9-1ace976ecfc5"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:45:04.697528 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.697506 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kube-api-access-6mqh4" (OuterVolumeSpecName: "kube-api-access-6mqh4") pod "6deedf63-5e4e-44d1-aab9-1ace976ecfc5" (UID: "6deedf63-5e4e-44d1-aab9-1ace976ecfc5"). InnerVolumeSpecName "kube-api-access-6mqh4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:45:04.795980 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.795943 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:45:04.795980 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.795975 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:45:04.795980 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.795988 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:45:04.796250 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.796000 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6mqh4\" (UniqueName: \"kubernetes.io/projected/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-kube-api-access-6mqh4\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:45:04.796250 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:04.796012 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6deedf63-5e4e-44d1-aab9-1ace976ecfc5-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:45:05.162312 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.162280 2561 generic.go:358] "Generic (PLEG): container finished" podID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerID="19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66" exitCode=0 Apr 17 16:45:05.162512 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.162372 2561 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" Apr 17 16:45:05.162512 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.162371 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" event={"ID":"6deedf63-5e4e-44d1-aab9-1ace976ecfc5","Type":"ContainerDied","Data":"19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66"} Apr 17 16:45:05.162512 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.162409 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt" event={"ID":"6deedf63-5e4e-44d1-aab9-1ace976ecfc5","Type":"ContainerDied","Data":"2f40d6056a9dada8a3ac415f449debcaf9aca91397d567425caed04b71cc0eb7"} Apr 17 16:45:05.162512 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.162425 2561 scope.go:117] "RemoveContainer" containerID="19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66" Apr 17 16:45:05.170728 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.170704 2561 scope.go:117] "RemoveContainer" containerID="ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe" Apr 17 16:45:05.177853 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.177832 2561 scope.go:117] "RemoveContainer" containerID="2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0" Apr 17 16:45:05.183583 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.183558 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"] Apr 17 16:45:05.187141 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.187112 2561 scope.go:117] "RemoveContainer" containerID="19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66" Apr 17 16:45:05.187453 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:45:05.187433 
2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66\": container with ID starting with 19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66 not found: ID does not exist" containerID="19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66" Apr 17 16:45:05.187568 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.187467 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66"} err="failed to get container status \"19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66\": rpc error: code = NotFound desc = could not find container \"19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66\": container with ID starting with 19271099602a8d871ef6b54cd331ee158b5f238c37459a197141ee7d64546c66 not found: ID does not exist" Apr 17 16:45:05.187568 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.187495 2561 scope.go:117] "RemoveContainer" containerID="ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe" Apr 17 16:45:05.187900 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:45:05.187774 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe\": container with ID starting with ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe not found: ID does not exist" containerID="ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe" Apr 17 16:45:05.187900 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.187804 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe"} err="failed to get container status 
\"ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe\": rpc error: code = NotFound desc = could not find container \"ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe\": container with ID starting with ea496a9471d4a2134f05595824d1293316deda941f08992ddb1740c1afc708fe not found: ID does not exist" Apr 17 16:45:05.187900 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.187827 2561 scope.go:117] "RemoveContainer" containerID="2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0" Apr 17 16:45:05.188098 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.187942 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-6d578c6s2rvt"] Apr 17 16:45:05.188261 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:45:05.188236 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0\": container with ID starting with 2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0 not found: ID does not exist" containerID="2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0" Apr 17 16:45:05.188341 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:05.188271 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0"} err="failed to get container status \"2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0\": rpc error: code = NotFound desc = could not find container \"2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0\": container with ID starting with 2cbca80b34062052619f7bd8f347f1f22a4b3aea7f4925bf274b3e1ff71912c0 not found: ID does not exist" Apr 17 16:45:06.524107 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:06.524053 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" path="/var/lib/kubelet/pods/6deedf63-5e4e-44d1-aab9-1ace976ecfc5/volumes" Apr 17 16:45:09.575203 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575172 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"] Apr 17 16:45:09.575680 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575660 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="main" Apr 17 16:45:09.575768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575684 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="main" Apr 17 16:45:09.575768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575702 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="storage-initializer" Apr 17 16:45:09.575768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575710 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="storage-initializer" Apr 17 16:45:09.575768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575721 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="tokenizer" Apr 17 16:45:09.575768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575731 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="tokenizer" Apr 17 16:45:09.576000 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575820 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="tokenizer" Apr 17 16:45:09.576000 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.575836 2561 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="6deedf63-5e4e-44d1-aab9-1ace976ecfc5" containerName="main" Apr 17 16:45:09.578953 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.578933 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.581223 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.581200 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 17 16:45:09.589114 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.589051 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"] Apr 17 16:45:09.636527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.636495 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-dshm\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.636527 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.636534 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-model-cache\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.636754 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.636554 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/168d93e1-75e1-45f9-9389-bad14362e44f-tls-certs\") pod 
\"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.636754 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.636609 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-home\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.636754 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.636643 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vqb2\" (UniqueName: \"kubernetes.io/projected/168d93e1-75e1-45f9-9389-bad14362e44f-kube-api-access-5vqb2\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.636754 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.636697 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.737484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.737450 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-dshm\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.737484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.737493 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-model-cache\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.737746 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.737514 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/168d93e1-75e1-45f9-9389-bad14362e44f-tls-certs\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.737746 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.737551 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-home\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.737746 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.737578 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqb2\" (UniqueName: \"kubernetes.io/projected/168d93e1-75e1-45f9-9389-bad14362e44f-kube-api-access-5vqb2\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.737746 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.737606 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.738048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.738024 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-home\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.738158 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.738039 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-model-cache\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.738158 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.738054 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.739771 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.739750 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-dshm\") pod 
\"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.740140 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.740121 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/168d93e1-75e1-45f9-9389-bad14362e44f-tls-certs\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.745413 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.745386 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vqb2\" (UniqueName: \"kubernetes.io/projected/168d93e1-75e1-45f9-9389-bad14362e44f-kube-api-access-5vqb2\") pod \"precise-prefix-cache-test-kserve-7466444c7d-2wx8c\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.857131 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.857044 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"] Apr 17 16:45:09.859617 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.859594 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:09.862170 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.862144 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-v9c4x\"" Apr 17 16:45:09.871161 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.871138 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"] Apr 17 16:45:09.890248 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.890216 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" Apr 17 16:45:09.939915 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.939879 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:09.940050 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.939945 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:09.940050 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.939989 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2blnr\" (UniqueName: \"kubernetes.io/projected/602dd996-2a4f-434e-a833-f1161cb5261e-kube-api-access-2blnr\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:09.940050 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.940028 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:09.940247 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.940103 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/602dd996-2a4f-434e-a833-f1161cb5261e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:09.940247 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:09.940130 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.018380 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.018349 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"] Apr 17 16:45:10.022123 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:45:10.022096 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168d93e1_75e1_45f9_9389_bad14362e44f.slice/crio-648beb8665e0a458c93d4970c4b61db981e84f5afe1138d0b4b8f6ccceeddd3c WatchSource:0}: Error finding container 648beb8665e0a458c93d4970c4b61db981e84f5afe1138d0b4b8f6ccceeddd3c: Status 404 returned error can't find the container with id 648beb8665e0a458c93d4970c4b61db981e84f5afe1138d0b4b8f6ccceeddd3c Apr 17 16:45:10.040997 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.040975 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041066 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041017 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041066 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041042 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2blnr\" (UniqueName: \"kubernetes.io/projected/602dd996-2a4f-434e-a833-f1161cb5261e-kube-api-access-2blnr\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: 
\"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041267 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041252 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041329 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041293 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/602dd996-2a4f-434e-a833-f1161cb5261e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041329 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041309 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041438 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041399 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041478 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041450 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041566 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.041674 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.041617 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" Apr 17 16:45:10.043655 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.043633 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/602dd996-2a4f-434e-a833-f1161cb5261e-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:10.048237 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.048218 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2blnr\" (UniqueName: \"kubernetes.io/projected/602dd996-2a4f-434e-a833-f1161cb5261e-kube-api-access-2blnr\") pod \"precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:10.169238 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.169143 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:10.183189 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.183146 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" event={"ID":"168d93e1-75e1-45f9-9389-bad14362e44f","Type":"ContainerStarted","Data":"8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba"}
Apr 17 16:45:10.183189 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.183186 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" event={"ID":"168d93e1-75e1-45f9-9389-bad14362e44f","Type":"ContainerStarted","Data":"648beb8665e0a458c93d4970c4b61db981e84f5afe1138d0b4b8f6ccceeddd3c"}
Apr 17 16:45:10.314749 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:10.314713 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"]
Apr 17 16:45:10.315853 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:45:10.315818 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602dd996_2a4f_434e_a833_f1161cb5261e.slice/crio-a39ebf8d72a6ec9587e0f65439b9b870423b0101f9be282a31b9199a556ef531 WatchSource:0}: Error finding container a39ebf8d72a6ec9587e0f65439b9b870423b0101f9be282a31b9199a556ef531: Status 404 returned error can't find the container with id a39ebf8d72a6ec9587e0f65439b9b870423b0101f9be282a31b9199a556ef531
Apr 17 16:45:11.189119 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:11.189058 2561 generic.go:358] "Generic (PLEG): container finished" podID="602dd996-2a4f-434e-a833-f1161cb5261e" containerID="c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed" exitCode=0
Apr 17 16:45:11.189560 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:11.189196 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" event={"ID":"602dd996-2a4f-434e-a833-f1161cb5261e","Type":"ContainerDied","Data":"c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed"}
Apr 17 16:45:11.189560 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:11.189241 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" event={"ID":"602dd996-2a4f-434e-a833-f1161cb5261e","Type":"ContainerStarted","Data":"a39ebf8d72a6ec9587e0f65439b9b870423b0101f9be282a31b9199a556ef531"}
Apr 17 16:45:12.194526 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:12.194491 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" event={"ID":"602dd996-2a4f-434e-a833-f1161cb5261e","Type":"ContainerStarted","Data":"784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702"}
Apr 17 16:45:12.194526 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:12.194527 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" event={"ID":"602dd996-2a4f-434e-a833-f1161cb5261e","Type":"ContainerStarted","Data":"f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d"}
Apr 17 16:45:12.194950 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:12.194644 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:12.217523 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:12.217475 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" podStartSLOduration=3.21745782 podStartE2EDuration="3.21745782s" podCreationTimestamp="2026-04-17 16:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:45:12.214823404 +0000 UTC m=+860.112584861" watchObservedRunningTime="2026-04-17 16:45:12.21745782 +0000 UTC m=+860.115219275"
Apr 17 16:45:14.203469 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:14.203434 2561 generic.go:358] "Generic (PLEG): container finished" podID="168d93e1-75e1-45f9-9389-bad14362e44f" containerID="8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba" exitCode=0
Apr 17 16:45:14.203948 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:14.203515 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" event={"ID":"168d93e1-75e1-45f9-9389-bad14362e44f","Type":"ContainerDied","Data":"8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba"}
Apr 17 16:45:16.216510 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:16.216474 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" event={"ID":"168d93e1-75e1-45f9-9389-bad14362e44f","Type":"ContainerStarted","Data":"fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247"}
Apr 17 16:45:16.243069 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:16.243005 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" podStartSLOduration=6.124415316 podStartE2EDuration="7.242985244s" podCreationTimestamp="2026-04-17 16:45:09 +0000 UTC" firstStartedPulling="2026-04-17 16:45:14.204536056 +0000 UTC m=+862.102297488" lastFinishedPulling="2026-04-17 16:45:15.323105965 +0000 UTC m=+863.220867416" observedRunningTime="2026-04-17 16:45:16.238878244 +0000 UTC m=+864.136639689" watchObservedRunningTime="2026-04-17 16:45:16.242985244 +0000 UTC m=+864.140746703"
Apr 17 16:45:19.890423 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:19.890382 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"
Apr 17 16:45:19.890423 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:19.890432 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"
Apr 17 16:45:19.903018 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:19.902983 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"
Apr 17 16:45:20.141860 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:20.141772 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb"
Apr 17 16:45:20.169495 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:20.169463 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:20.169495 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:20.169503 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:20.170734 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:45:20.170709 2561 logging.go:55] [core] [Channel #53 SubChannel #54]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused"
Apr 17 16:45:20.172214 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:20.172187 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:20.229680 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:20.229649 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:20.239834 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:20.239808 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"
Apr 17 16:45:21.170446 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:21.170399 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded"
Apr 17 16:45:30.170766 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:45:30.170724 2561 logging.go:55] [core] [Channel #61 SubChannel #62]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused"
Apr 17 16:45:31.170407 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:31.170344 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded"
Apr 17 16:45:41.232994 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:41.232963 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:42.304744 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.304711 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"]
Apr 17 16:45:42.305846 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.305813 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" podUID="168d93e1-75e1-45f9-9389-bad14362e44f" containerName="main" containerID="cri-o://fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247" gracePeriod=30
Apr 17 16:45:42.310893 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.310730 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"]
Apr 17 16:45:42.311177 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.311147 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="main" containerID="cri-o://f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d" gracePeriod=30
Apr 17 16:45:42.311410 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.311251 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="tokenizer" containerID="cri-o://784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702" gracePeriod=30
Apr 17 16:45:42.563312 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.563286 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"
Apr 17 16:45:42.737576 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.737541 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-model-cache\") pod \"168d93e1-75e1-45f9-9389-bad14362e44f\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") "
Apr 17 16:45:42.737576 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.737579 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vqb2\" (UniqueName: \"kubernetes.io/projected/168d93e1-75e1-45f9-9389-bad14362e44f-kube-api-access-5vqb2\") pod \"168d93e1-75e1-45f9-9389-bad14362e44f\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") "
Apr 17 16:45:42.737812 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.737602 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-kserve-provision-location\") pod \"168d93e1-75e1-45f9-9389-bad14362e44f\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") "
Apr 17 16:45:42.737812 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.737665 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/168d93e1-75e1-45f9-9389-bad14362e44f-tls-certs\") pod \"168d93e1-75e1-45f9-9389-bad14362e44f\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") "
Apr 17 16:45:42.737812 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.737720 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-dshm\") pod \"168d93e1-75e1-45f9-9389-bad14362e44f\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") "
Apr 17 16:45:42.737812 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.737753 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-home\") pod \"168d93e1-75e1-45f9-9389-bad14362e44f\" (UID: \"168d93e1-75e1-45f9-9389-bad14362e44f\") "
Apr 17 16:45:42.738016 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.737826 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-model-cache" (OuterVolumeSpecName: "model-cache") pod "168d93e1-75e1-45f9-9389-bad14362e44f" (UID: "168d93e1-75e1-45f9-9389-bad14362e44f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:42.738016 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.737989 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-home" (OuterVolumeSpecName: "home") pod "168d93e1-75e1-45f9-9389-bad14362e44f" (UID: "168d93e1-75e1-45f9-9389-bad14362e44f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:42.738154 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.738028 2561 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-model-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:42.739877 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.739815 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168d93e1-75e1-45f9-9389-bad14362e44f-kube-api-access-5vqb2" (OuterVolumeSpecName: "kube-api-access-5vqb2") pod "168d93e1-75e1-45f9-9389-bad14362e44f" (UID: "168d93e1-75e1-45f9-9389-bad14362e44f"). InnerVolumeSpecName "kube-api-access-5vqb2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:45:42.740011 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.739903 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/168d93e1-75e1-45f9-9389-bad14362e44f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "168d93e1-75e1-45f9-9389-bad14362e44f" (UID: "168d93e1-75e1-45f9-9389-bad14362e44f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:45:42.740011 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.739964 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-dshm" (OuterVolumeSpecName: "dshm") pod "168d93e1-75e1-45f9-9389-bad14362e44f" (UID: "168d93e1-75e1-45f9-9389-bad14362e44f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:42.799794 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.799736 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "168d93e1-75e1-45f9-9389-bad14362e44f" (UID: "168d93e1-75e1-45f9-9389-bad14362e44f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:42.839240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.839146 2561 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-dshm\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:42.839240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.839187 2561 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-home\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:42.839240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.839200 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5vqb2\" (UniqueName: \"kubernetes.io/projected/168d93e1-75e1-45f9-9389-bad14362e44f-kube-api-access-5vqb2\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:42.839240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.839211 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/168d93e1-75e1-45f9-9389-bad14362e44f-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:42.839240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:42.839221 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/168d93e1-75e1-45f9-9389-bad14362e44f-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:43.312442 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.312408 2561 generic.go:358] "Generic (PLEG): container finished" podID="602dd996-2a4f-434e-a833-f1161cb5261e" containerID="f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d" exitCode=0
Apr 17 16:45:43.312864 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.312477 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" event={"ID":"602dd996-2a4f-434e-a833-f1161cb5261e","Type":"ContainerDied","Data":"f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d"}
Apr 17 16:45:43.313884 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.313857 2561 generic.go:358] "Generic (PLEG): container finished" podID="168d93e1-75e1-45f9-9389-bad14362e44f" containerID="fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247" exitCode=0
Apr 17 16:45:43.314044 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.313895 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" event={"ID":"168d93e1-75e1-45f9-9389-bad14362e44f","Type":"ContainerDied","Data":"fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247"}
Apr 17 16:45:43.314044 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.313921 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c" event={"ID":"168d93e1-75e1-45f9-9389-bad14362e44f","Type":"ContainerDied","Data":"648beb8665e0a458c93d4970c4b61db981e84f5afe1138d0b4b8f6ccceeddd3c"}
Apr 17 16:45:43.314044 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.313939 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"
Apr 17 16:45:43.314044 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.313940 2561 scope.go:117] "RemoveContainer" containerID="fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247"
Apr 17 16:45:43.327038 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.327017 2561 scope.go:117] "RemoveContainer" containerID="8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba"
Apr 17 16:45:43.335697 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.335670 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"]
Apr 17 16:45:43.337723 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.337703 2561 scope.go:117] "RemoveContainer" containerID="fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247"
Apr 17 16:45:43.338174 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:45:43.338146 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247\": container with ID starting with fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247 not found: ID does not exist" containerID="fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247"
Apr 17 16:45:43.338283 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.338187 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247"} err="failed to get container status \"fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247\": rpc error: code = NotFound desc = could not find container \"fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247\": container with ID starting with fd0353e76d67c521071940c51e2fa2851777629657929b134315c2a76b483247 not found: ID does not exist"
Apr 17 16:45:43.338283 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.338216 2561 scope.go:117] "RemoveContainer" containerID="8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba"
Apr 17 16:45:43.338609 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:45:43.338582 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba\": container with ID starting with 8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba not found: ID does not exist" containerID="8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba"
Apr 17 16:45:43.338708 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.338622 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba"} err="failed to get container status \"8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba\": rpc error: code = NotFound desc = could not find container \"8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba\": container with ID starting with 8a0fe8e454021fe54ca49330e18e96ec6789df39f4eaedff74fea89f8f2341ba not found: ID does not exist"
Apr 17 16:45:43.340753 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.340732 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-7466444c7d-2wx8c"]
Apr 17 16:45:43.880265 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:43.880239 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:44.050033 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.049951 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/602dd996-2a4f-434e-a833-f1161cb5261e-tls-certs\") pod \"602dd996-2a4f-434e-a833-f1161cb5261e\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") "
Apr 17 16:45:44.050033 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050002 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2blnr\" (UniqueName: \"kubernetes.io/projected/602dd996-2a4f-434e-a833-f1161cb5261e-kube-api-access-2blnr\") pod \"602dd996-2a4f-434e-a833-f1161cb5261e\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") "
Apr 17 16:45:44.050033 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050030 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-uds\") pod \"602dd996-2a4f-434e-a833-f1161cb5261e\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") "
Apr 17 16:45:44.050319 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050113 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-tmp\") pod \"602dd996-2a4f-434e-a833-f1161cb5261e\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") "
Apr 17 16:45:44.050319 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050133 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-cache\") pod \"602dd996-2a4f-434e-a833-f1161cb5261e\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") "
Apr 17 16:45:44.050319 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050166 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-kserve-provision-location\") pod \"602dd996-2a4f-434e-a833-f1161cb5261e\" (UID: \"602dd996-2a4f-434e-a833-f1161cb5261e\") "
Apr 17 16:45:44.050522 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050482 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "602dd996-2a4f-434e-a833-f1161cb5261e" (UID: "602dd996-2a4f-434e-a833-f1161cb5261e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:44.050591 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050549 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "602dd996-2a4f-434e-a833-f1161cb5261e" (UID: "602dd996-2a4f-434e-a833-f1161cb5261e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:44.050591 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050577 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "602dd996-2a4f-434e-a833-f1161cb5261e" (UID: "602dd996-2a4f-434e-a833-f1161cb5261e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:44.050959 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.050936 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "602dd996-2a4f-434e-a833-f1161cb5261e" (UID: "602dd996-2a4f-434e-a833-f1161cb5261e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:45:44.052243 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.052218 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602dd996-2a4f-434e-a833-f1161cb5261e-kube-api-access-2blnr" (OuterVolumeSpecName: "kube-api-access-2blnr") pod "602dd996-2a4f-434e-a833-f1161cb5261e" (UID: "602dd996-2a4f-434e-a833-f1161cb5261e"). InnerVolumeSpecName "kube-api-access-2blnr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:45:44.052329 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.052276 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/602dd996-2a4f-434e-a833-f1161cb5261e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "602dd996-2a4f-434e-a833-f1161cb5261e" (UID: "602dd996-2a4f-434e-a833-f1161cb5261e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:45:44.150909 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.150872 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:44.150909 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.150902 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:44.150909 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.150911 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:44.151152 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.150922 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/602dd996-2a4f-434e-a833-f1161cb5261e-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:44.151152 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.150932 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/602dd996-2a4f-434e-a833-f1161cb5261e-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:44.151152 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.150941 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2blnr\" (UniqueName: \"kubernetes.io/projected/602dd996-2a4f-434e-a833-f1161cb5261e-kube-api-access-2blnr\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:45:44.319070 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.319038 2561 generic.go:358] "Generic (PLEG): container finished" podID="602dd996-2a4f-434e-a833-f1161cb5261e" containerID="784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702" exitCode=0
Apr 17 16:45:44.319489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.319102 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" event={"ID":"602dd996-2a4f-434e-a833-f1161cb5261e","Type":"ContainerDied","Data":"784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702"}
Apr 17 16:45:44.319489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.319138 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9" event={"ID":"602dd996-2a4f-434e-a833-f1161cb5261e","Type":"ContainerDied","Data":"a39ebf8d72a6ec9587e0f65439b9b870423b0101f9be282a31b9199a556ef531"}
Apr 17 16:45:44.319489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.319159 2561 scope.go:117] "RemoveContainer" containerID="784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702"
Apr 17 16:45:44.319489 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.319173 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"
Apr 17 16:45:44.327701 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.327349 2561 scope.go:117] "RemoveContainer" containerID="f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d"
Apr 17 16:45:44.334552 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.334536 2561 scope.go:117] "RemoveContainer" containerID="c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed"
Apr 17 16:45:44.341606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.341572 2561 scope.go:117] "RemoveContainer" containerID="784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702"
Apr 17 16:45:44.342147 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:45:44.342107 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702\": container with ID starting with 784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702 not found: ID does not exist" containerID="784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702"
Apr 17 16:45:44.342258 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.342158 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702"} err="failed to get container status \"784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702\": rpc error: code = NotFound desc = could not find container \"784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702\": container with ID starting with 784b1ea608d7ece2df0c9b032457bb7c9e1584e1385905e6e85fdba9fe386702 not found: ID does not exist"
Apr 17 16:45:44.342258 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.342182 2561 scope.go:117] "RemoveContainer" containerID="f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d"
Apr 17 16:45:44.342511 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:45:44.342487 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d\": container with ID starting with f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d not found: ID does not exist" containerID="f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d"
Apr 17 16:45:44.342609 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.342518 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d"} err="failed to get container status \"f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d\": rpc error: code = NotFound desc = could not find container \"f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d\": container with ID starting with f5ffffb220a25167c2de3e0deb295d5e0f20c91b6c290c468759861ea3da0e8d not found: ID does not exist"
Apr 17 16:45:44.342609 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.342543 2561 scope.go:117] "RemoveContainer" containerID="c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed"
Apr 17 16:45:44.342924 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:45:44.342898 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed\": container with ID starting with c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed not found: ID does not exist" containerID="c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed"
Apr 17 16:45:44.343008 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.342933 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed"} err="failed to get container status \"c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed\": rpc error: code = NotFound desc = could not find container \"c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed\": container with ID starting with c3da6a999c0a539b4e7b733ff87741a52a6a8d4fca1dddcfba2def69b96d32ed not found: ID does not exist"
Apr 17 16:45:44.344313 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.344293 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"]
Apr 17 16:45:44.347900 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.347882 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-6c5c64dfr5pr9"]
Apr 17 16:45:44.522744 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.522711 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168d93e1-75e1-45f9-9389-bad14362e44f" path="/var/lib/kubelet/pods/168d93e1-75e1-45f9-9389-bad14362e44f/volumes"
Apr 17 16:45:44.523166 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:45:44.523152 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" path="/var/lib/kubelet/pods/602dd996-2a4f-434e-a833-f1161cb5261e/volumes"
Apr 17 16:47:26.959174 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:26.959139 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb"]
Apr 17 16:47:26.962942 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:26.959453 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="main" containerID="cri-o://8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8" gracePeriod=30
Apr 17 16:47:26.962942 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:26.959552 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="tokenizer" containerID="cri-o://77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f" gracePeriod=30
Apr 17 16:47:27.657178 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:27.657137 2561 generic.go:358] "Generic (PLEG): container finished" podID="745f742b-e524-426a-b157-8f644975b283" containerID="8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8" exitCode=0
Apr 17 16:47:27.657365 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:27.657212 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" event={"ID":"745f742b-e524-426a-b157-8f644975b283","Type":"ContainerDied","Data":"8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8"}
Apr 17 16:47:28.216026 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.215999 2561 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:47:28.282577 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282488 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-tmp\") pod \"745f742b-e524-426a-b157-8f644975b283\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " Apr 17 16:47:28.282577 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282529 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/745f742b-e524-426a-b157-8f644975b283-tls-certs\") pod \"745f742b-e524-426a-b157-8f644975b283\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " Apr 17 16:47:28.282799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282598 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-uds\") pod \"745f742b-e524-426a-b157-8f644975b283\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " Apr 17 16:47:28.282799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282624 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-kserve-provision-location\") pod \"745f742b-e524-426a-b157-8f644975b283\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " Apr 17 16:47:28.282799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282644 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p7sr\" (UniqueName: \"kubernetes.io/projected/745f742b-e524-426a-b157-8f644975b283-kube-api-access-6p7sr\") pod \"745f742b-e524-426a-b157-8f644975b283\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " Apr 17 
16:47:28.282799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282668 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-cache\") pod \"745f742b-e524-426a-b157-8f644975b283\" (UID: \"745f742b-e524-426a-b157-8f644975b283\") " Apr 17 16:47:28.282994 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282874 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "745f742b-e524-426a-b157-8f644975b283" (UID: "745f742b-e524-426a-b157-8f644975b283"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:28.282994 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282913 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "745f742b-e524-426a-b157-8f644975b283" (UID: "745f742b-e524-426a-b157-8f644975b283"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:28.282994 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.282973 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "745f742b-e524-426a-b157-8f644975b283" (UID: "745f742b-e524-426a-b157-8f644975b283"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:28.283430 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.283408 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "745f742b-e524-426a-b157-8f644975b283" (UID: "745f742b-e524-426a-b157-8f644975b283"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:47:28.284735 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.284715 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/745f742b-e524-426a-b157-8f644975b283-kube-api-access-6p7sr" (OuterVolumeSpecName: "kube-api-access-6p7sr") pod "745f742b-e524-426a-b157-8f644975b283" (UID: "745f742b-e524-426a-b157-8f644975b283"). InnerVolumeSpecName "kube-api-access-6p7sr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:47:28.284834 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.284812 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/745f742b-e524-426a-b157-8f644975b283-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "745f742b-e524-426a-b157-8f644975b283" (UID: "745f742b-e524-426a-b157-8f644975b283"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:47:28.384123 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.384060 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:47:28.384123 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.384116 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/745f742b-e524-426a-b157-8f644975b283-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:47:28.384123 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.384125 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:47:28.384123 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.384134 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:47:28.384386 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.384145 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6p7sr\" (UniqueName: \"kubernetes.io/projected/745f742b-e524-426a-b157-8f644975b283-kube-api-access-6p7sr\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:47:28.384386 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.384154 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/745f742b-e524-426a-b157-8f644975b283-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:47:28.662671 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:47:28.662637 2561 generic.go:358] "Generic (PLEG): container finished" podID="745f742b-e524-426a-b157-8f644975b283" containerID="77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f" exitCode=0 Apr 17 16:47:28.662671 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.662674 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" event={"ID":"745f742b-e524-426a-b157-8f644975b283","Type":"ContainerDied","Data":"77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f"} Apr 17 16:47:28.662895 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.662696 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" event={"ID":"745f742b-e524-426a-b157-8f644975b283","Type":"ContainerDied","Data":"79f191f0e6fc2adf557365fd2f0cddc1f8031d411d75ed7b40f007e736c02d3d"} Apr 17 16:47:28.662895 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.662710 2561 scope.go:117] "RemoveContainer" containerID="77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f" Apr 17 16:47:28.662895 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.662717 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb" Apr 17 16:47:28.670691 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.670670 2561 scope.go:117] "RemoveContainer" containerID="8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8" Apr 17 16:47:28.677396 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.677361 2561 scope.go:117] "RemoveContainer" containerID="6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d" Apr 17 16:47:28.680034 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.679984 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb"] Apr 17 16:47:28.683735 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.683713 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheqzjjb"] Apr 17 16:47:28.686721 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.686697 2561 scope.go:117] "RemoveContainer" containerID="77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f" Apr 17 16:47:28.687009 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:47:28.686989 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f\": container with ID starting with 77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f not found: ID does not exist" containerID="77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f" Apr 17 16:47:28.687175 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.687022 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f"} err="failed to get container status \"77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f\": rpc 
error: code = NotFound desc = could not find container \"77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f\": container with ID starting with 77a67c20add37b55b0658c11c8924e1453a43a9dc860c75d8603e0ec10bbfc8f not found: ID does not exist" Apr 17 16:47:28.687175 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.687053 2561 scope.go:117] "RemoveContainer" containerID="8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8" Apr 17 16:47:28.687372 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:47:28.687341 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8\": container with ID starting with 8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8 not found: ID does not exist" containerID="8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8" Apr 17 16:47:28.687419 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.687385 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8"} err="failed to get container status \"8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8\": rpc error: code = NotFound desc = could not find container \"8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8\": container with ID starting with 8e6a79ee2feee450b988b0d812ef70df45e50bcb3403a629b9efad75a4463ac8 not found: ID does not exist" Apr 17 16:47:28.687419 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.687406 2561 scope.go:117] "RemoveContainer" containerID="6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d" Apr 17 16:47:28.687677 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:47:28.687659 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d\": container with ID starting with 6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d not found: ID does not exist" containerID="6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d" Apr 17 16:47:28.687735 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:28.687685 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d"} err="failed to get container status \"6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d\": rpc error: code = NotFound desc = could not find container \"6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d\": container with ID starting with 6b231b90653679c53a58744a2933af4fba3bd9b9645fd9bdd6bf326b5eac540d not found: ID does not exist" Apr 17 16:47:30.522939 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.522906 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="745f742b-e524-426a-b157-8f644975b283" path="/var/lib/kubelet/pods/745f742b-e524-426a-b157-8f644975b283/volumes" Apr 17 16:47:30.859224 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859187 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v"] Apr 17 16:47:30.859484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859472 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="storage-initializer" Apr 17 16:47:30.859530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859488 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="storage-initializer" Apr 17 16:47:30.859530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859498 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="745f742b-e524-426a-b157-8f644975b283" containerName="tokenizer" Apr 17 16:47:30.859530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859504 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="tokenizer" Apr 17 16:47:30.859530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859511 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="tokenizer" Apr 17 16:47:30.859530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859517 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="tokenizer" Apr 17 16:47:30.859530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859523 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="168d93e1-75e1-45f9-9389-bad14362e44f" containerName="main" Apr 17 16:47:30.859530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859529 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="168d93e1-75e1-45f9-9389-bad14362e44f" containerName="main" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859536 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="main" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859541 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="main" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859549 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="storage-initializer" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859554 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" 
containerName="storage-initializer" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859563 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="168d93e1-75e1-45f9-9389-bad14362e44f" containerName="storage-initializer" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859568 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="168d93e1-75e1-45f9-9389-bad14362e44f" containerName="storage-initializer" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859576 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="main" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859581 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="main" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859629 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="168d93e1-75e1-45f9-9389-bad14362e44f" containerName="main" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859637 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="main" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859643 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="tokenizer" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859649 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="602dd996-2a4f-434e-a833-f1161cb5261e" containerName="main" Apr 17 16:47:30.859733 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.859654 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="745f742b-e524-426a-b157-8f644975b283" containerName="tokenizer" Apr 17 16:47:30.864470 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.864442 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:30.866680 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.866652 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-t5scr\"" Apr 17 16:47:30.867566 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.867538 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:47:30.867682 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.867562 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 17 16:47:30.867682 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.867628 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:47:30.867813 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.867636 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\"" Apr 17 16:47:30.875493 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:30.875467 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v"] Apr 17 16:47:31.008158 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.008114 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.008312 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.008179 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zzmv\" (UniqueName: \"kubernetes.io/projected/51d0362b-b1e3-4e36-99af-f63b4039a431-kube-api-access-9zzmv\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.008312 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.008222 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0362b-b1e3-4e36-99af-f63b4039a431-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.008312 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.008291 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.008438 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.008341 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: 
\"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.008438 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.008373 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.109532 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.109440 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.109532 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.109487 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.109756 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.109610 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-kserve-provision-location\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.109756 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.109702 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.109756 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.109727 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zzmv\" (UniqueName: \"kubernetes.io/projected/51d0362b-b1e3-4e36-99af-f63b4039a431-kube-api-access-9zzmv\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.109951 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.109764 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0362b-b1e3-4e36-99af-f63b4039a431-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.109951 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.109907 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-cache\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.110030 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.109986 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.110098 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.110055 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.110253 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.110237 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.112220 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.112203 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0362b-b1e3-4e36-99af-f63b4039a431-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: 
\"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.117661 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.117639 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zzmv\" (UniqueName: \"kubernetes.io/projected/51d0362b-b1e3-4e36-99af-f63b4039a431-kube-api-access-9zzmv\") pod \"custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.174599 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.174570 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:31.301389 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.301355 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v"] Apr 17 16:47:31.311029 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.311006 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:47:31.674449 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.674357 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" event={"ID":"51d0362b-b1e3-4e36-99af-f63b4039a431","Type":"ContainerStarted","Data":"7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c"} Apr 17 16:47:31.674449 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:31.674396 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" 
event={"ID":"51d0362b-b1e3-4e36-99af-f63b4039a431","Type":"ContainerStarted","Data":"50070c42435389c4d03d2520fb2d5ada7a9062f8f808db7c57fff66e8a4bb8c9"} Apr 17 16:47:32.678964 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:32.678927 2561 generic.go:358] "Generic (PLEG): container finished" podID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerID="7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c" exitCode=0 Apr 17 16:47:32.679340 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:32.679015 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" event={"ID":"51d0362b-b1e3-4e36-99af-f63b4039a431","Type":"ContainerDied","Data":"7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c"} Apr 17 16:47:33.683928 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:33.683894 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" event={"ID":"51d0362b-b1e3-4e36-99af-f63b4039a431","Type":"ContainerStarted","Data":"dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4"} Apr 17 16:47:33.683928 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:33.683929 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" event={"ID":"51d0362b-b1e3-4e36-99af-f63b4039a431","Type":"ContainerStarted","Data":"796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3"} Apr 17 16:47:33.684372 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:33.684061 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:33.705812 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:33.705734 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" podStartSLOduration=3.705712471 podStartE2EDuration="3.705712471s" podCreationTimestamp="2026-04-17 16:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:47:33.704988224 +0000 UTC m=+1001.602749679" watchObservedRunningTime="2026-04-17 16:47:33.705712471 +0000 UTC m=+1001.603473919" Apr 17 16:47:41.175443 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:41.175399 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:41.175849 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:41.175564 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:41.178197 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:41.178171 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:47:41.712227 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:47:41.712195 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:48:03.718633 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:48:03.718601 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:49:16.708908 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:16.708701 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v"] Apr 17 16:49:16.709676 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:49:16.709627 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="main" containerID="cri-o://796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3" gracePeriod=30 Apr 17 16:49:16.709852 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:16.709788 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="tokenizer" containerID="cri-o://dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4" gracePeriod=30 Apr 17 16:49:17.029478 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:17.029395 2561 generic.go:358] "Generic (PLEG): container finished" podID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerID="796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3" exitCode=0 Apr 17 16:49:17.029617 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:17.029469 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" event={"ID":"51d0362b-b1e3-4e36-99af-f63b4039a431","Type":"ContainerDied","Data":"796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3"} Apr 17 16:49:17.960140 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:17.960116 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:49:18.035017 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.034934 2561 generic.go:358] "Generic (PLEG): container finished" podID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerID="dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4" exitCode=0 Apr 17 16:49:18.035017 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.034983 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" event={"ID":"51d0362b-b1e3-4e36-99af-f63b4039a431","Type":"ContainerDied","Data":"dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4"} Apr 17 16:49:18.035017 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.035012 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" event={"ID":"51d0362b-b1e3-4e36-99af-f63b4039a431","Type":"ContainerDied","Data":"50070c42435389c4d03d2520fb2d5ada7a9062f8f808db7c57fff66e8a4bb8c9"} Apr 17 16:49:18.035279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.035027 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v" Apr 17 16:49:18.035279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.035031 2561 scope.go:117] "RemoveContainer" containerID="dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4" Apr 17 16:49:18.042516 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.042499 2561 scope.go:117] "RemoveContainer" containerID="796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3" Apr 17 16:49:18.049523 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.049503 2561 scope.go:117] "RemoveContainer" containerID="7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c" Apr 17 16:49:18.056332 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.056310 2561 scope.go:117] "RemoveContainer" containerID="dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4" Apr 17 16:49:18.056576 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:49:18.056556 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4\": container with ID starting with dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4 not found: ID does not exist" containerID="dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4" Apr 17 16:49:18.056634 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.056586 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4"} err="failed to get container status \"dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4\": rpc error: code = NotFound desc = could not find container \"dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4\": container with ID starting with dc61daeaa28a6f1b6b23f424cf661042bba7ee66582caa17cd407be9e5876ad4 not found: ID does not exist" Apr 
17 16:49:18.056634 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.056605 2561 scope.go:117] "RemoveContainer" containerID="796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3" Apr 17 16:49:18.056824 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:49:18.056806 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3\": container with ID starting with 796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3 not found: ID does not exist" containerID="796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3" Apr 17 16:49:18.056887 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.056837 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3"} err="failed to get container status \"796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3\": rpc error: code = NotFound desc = could not find container \"796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3\": container with ID starting with 796ef280d85d0eede8e7f3a89e9bae622df5159a2a7e8db4269fb33521145ef3 not found: ID does not exist" Apr 17 16:49:18.056887 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.056861 2561 scope.go:117] "RemoveContainer" containerID="7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c" Apr 17 16:49:18.057110 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:49:18.057091 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c\": container with ID starting with 7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c not found: ID does not exist" containerID="7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c" Apr 17 16:49:18.057190 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.057114 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c"} err="failed to get container status \"7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c\": rpc error: code = NotFound desc = could not find container \"7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c\": container with ID starting with 7739d7f178be92e3305c0ffddf7506cda14a811876dfb72014373baa98a2437c not found: ID does not exist" Apr 17 16:49:18.091213 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091182 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-cache\") pod \"51d0362b-b1e3-4e36-99af-f63b4039a431\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " Apr 17 16:49:18.091373 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091263 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zzmv\" (UniqueName: \"kubernetes.io/projected/51d0362b-b1e3-4e36-99af-f63b4039a431-kube-api-access-9zzmv\") pod \"51d0362b-b1e3-4e36-99af-f63b4039a431\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " Apr 17 16:49:18.091373 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091292 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-kserve-provision-location\") pod \"51d0362b-b1e3-4e36-99af-f63b4039a431\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " Apr 17 16:49:18.091373 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091330 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-tmp\") pod \"51d0362b-b1e3-4e36-99af-f63b4039a431\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " Apr 17 16:49:18.091373 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091347 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0362b-b1e3-4e36-99af-f63b4039a431-tls-certs\") pod \"51d0362b-b1e3-4e36-99af-f63b4039a431\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " Apr 17 16:49:18.091373 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091367 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-uds\") pod \"51d0362b-b1e3-4e36-99af-f63b4039a431\" (UID: \"51d0362b-b1e3-4e36-99af-f63b4039a431\") " Apr 17 16:49:18.091600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091492 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "51d0362b-b1e3-4e36-99af-f63b4039a431" (UID: "51d0362b-b1e3-4e36-99af-f63b4039a431"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:18.091705 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091676 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "51d0362b-b1e3-4e36-99af-f63b4039a431" (UID: "51d0362b-b1e3-4e36-99af-f63b4039a431"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:18.091811 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.091705 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "51d0362b-b1e3-4e36-99af-f63b4039a431" (UID: "51d0362b-b1e3-4e36-99af-f63b4039a431"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:18.092114 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.092069 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "51d0362b-b1e3-4e36-99af-f63b4039a431" (UID: "51d0362b-b1e3-4e36-99af-f63b4039a431"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:49:18.093452 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.093429 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d0362b-b1e3-4e36-99af-f63b4039a431-kube-api-access-9zzmv" (OuterVolumeSpecName: "kube-api-access-9zzmv") pod "51d0362b-b1e3-4e36-99af-f63b4039a431" (UID: "51d0362b-b1e3-4e36-99af-f63b4039a431"). InnerVolumeSpecName "kube-api-access-9zzmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:49:18.093531 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.093430 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d0362b-b1e3-4e36-99af-f63b4039a431-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "51d0362b-b1e3-4e36-99af-f63b4039a431" (UID: "51d0362b-b1e3-4e36-99af-f63b4039a431"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:49:18.192284 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.192234 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:49:18.192284 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.192281 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51d0362b-b1e3-4e36-99af-f63b4039a431-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:49:18.192284 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.192291 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:49:18.192284 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.192300 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:49:18.192556 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.192308 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9zzmv\" (UniqueName: \"kubernetes.io/projected/51d0362b-b1e3-4e36-99af-f63b4039a431-kube-api-access-9zzmv\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:49:18.192556 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.192319 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51d0362b-b1e3-4e36-99af-f63b4039a431-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:49:18.365964 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:49:18.365928 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v"] Apr 17 16:49:18.370827 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.370797 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-75df9f66xsn7v"] Apr 17 16:49:18.522539 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:18.522504 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" path="/var/lib/kubelet/pods/51d0362b-b1e3-4e36-99af-f63b4039a431/volumes" Apr 17 16:49:40.051753 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.051715 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr"] Apr 17 16:49:40.052180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.052022 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="tokenizer" Apr 17 16:49:40.052180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.052033 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="tokenizer" Apr 17 16:49:40.052180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.052043 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="storage-initializer" Apr 17 16:49:40.052180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.052049 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="storage-initializer" Apr 17 16:49:40.052180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.052060 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="main" Apr 17 
16:49:40.052180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.052066 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="main" Apr 17 16:49:40.052180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.052128 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="tokenizer" Apr 17 16:49:40.052180 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.052136 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="51d0362b-b1e3-4e36-99af-f63b4039a431" containerName="main" Apr 17 16:49:40.055286 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.055265 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.057484 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.057466 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-zdk95\"" Apr 17 16:49:40.058059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.058031 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:49:40.058059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.058055 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 17 16:49:40.058259 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.058146 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\"" Apr 17 16:49:40.058259 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.058173 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:49:40.066673 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:49:40.066648 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr"] Apr 17 16:49:40.172654 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.172615 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.172824 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.172671 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.172824 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.172750 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.172824 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.172780 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1628d5fd-7e23-4470-8418-459c3a86ddd0-tls-certs\") pod 
\"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.172824 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.172814 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.172957 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.172835 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mg9\" (UniqueName: \"kubernetes.io/projected/1628d5fd-7e23-4470-8418-459c3a86ddd0-kube-api-access-x9mg9\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.273987 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.273947 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274165 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.273993 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mg9\" (UniqueName: \"kubernetes.io/projected/1628d5fd-7e23-4470-8418-459c3a86ddd0-kube-api-access-x9mg9\") pod 
\"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274165 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.274038 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274165 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.274112 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274165 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.274159 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274358 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.274233 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1628d5fd-7e23-4470-8418-459c3a86ddd0-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" 
(UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274446 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.274427 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274516 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.274498 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274551 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.274501 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.274587 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.274562 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.276676 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.276654 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1628d5fd-7e23-4470-8418-459c3a86ddd0-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.281926 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.281904 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mg9\" (UniqueName: \"kubernetes.io/projected/1628d5fd-7e23-4470-8418-459c3a86ddd0-kube-api-access-x9mg9\") pod \"router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.364645 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.364606 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:40.489510 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:40.489486 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr"] Apr 17 16:49:40.492005 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:49:40.491980 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1628d5fd_7e23_4470_8418_459c3a86ddd0.slice/crio-b4b325238b8262ea705b7362ca13427427beb1568682b64a63e26c4e9b0e1873 WatchSource:0}: Error finding container b4b325238b8262ea705b7362ca13427427beb1568682b64a63e26c4e9b0e1873: Status 404 returned error can't find the container with id b4b325238b8262ea705b7362ca13427427beb1568682b64a63e26c4e9b0e1873 Apr 17 16:49:41.110761 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:41.110724 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" event={"ID":"1628d5fd-7e23-4470-8418-459c3a86ddd0","Type":"ContainerStarted","Data":"6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6"} Apr 17 16:49:41.110761 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:41.110763 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" event={"ID":"1628d5fd-7e23-4470-8418-459c3a86ddd0","Type":"ContainerStarted","Data":"b4b325238b8262ea705b7362ca13427427beb1568682b64a63e26c4e9b0e1873"} Apr 17 16:49:42.115577 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:42.115542 2561 generic.go:358] "Generic (PLEG): container finished" podID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerID="6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6" exitCode=0 Apr 17 16:49:42.115967 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:42.115582 
2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" event={"ID":"1628d5fd-7e23-4470-8418-459c3a86ddd0","Type":"ContainerDied","Data":"6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6"} Apr 17 16:49:43.120342 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:43.120306 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" event={"ID":"1628d5fd-7e23-4470-8418-459c3a86ddd0","Type":"ContainerStarted","Data":"11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1"} Apr 17 16:49:43.120342 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:43.120345 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" event={"ID":"1628d5fd-7e23-4470-8418-459c3a86ddd0","Type":"ContainerStarted","Data":"9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998"} Apr 17 16:49:43.120875 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:43.120435 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:43.139898 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:43.139846 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" podStartSLOduration=3.139829982 podStartE2EDuration="3.139829982s" podCreationTimestamp="2026-04-17 16:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:49:43.138652402 +0000 UTC m=+1131.036413898" watchObservedRunningTime="2026-04-17 16:49:43.139829982 +0000 UTC m=+1131.037591491" Apr 17 16:49:50.365620 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:50.365574 
2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:50.365620 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:50.365613 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:50.368348 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:50.368325 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:51.152624 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:51.152592 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:49:56.895432 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:56.895395 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-6df48d4859-mht88"] Apr 17 16:49:56.900475 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:56.900453 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:56.904355 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:56.904329 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6df48d4859-mht88"] Apr 17 16:49:56.919289 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:56.919260 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chl4k\" (UniqueName: \"kubernetes.io/projected/7431d8d2-7cd4-4bc5-920c-37eceb31d473-kube-api-access-chl4k\") pod \"llmisvc-controller-manager-6df48d4859-mht88\" (UID: \"7431d8d2-7cd4-4bc5-920c-37eceb31d473\") " pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:56.919457 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:56.919306 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7431d8d2-7cd4-4bc5-920c-37eceb31d473-cert\") pod \"llmisvc-controller-manager-6df48d4859-mht88\" (UID: \"7431d8d2-7cd4-4bc5-920c-37eceb31d473\") " pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:57.019661 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:57.019616 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chl4k\" (UniqueName: \"kubernetes.io/projected/7431d8d2-7cd4-4bc5-920c-37eceb31d473-kube-api-access-chl4k\") pod \"llmisvc-controller-manager-6df48d4859-mht88\" (UID: \"7431d8d2-7cd4-4bc5-920c-37eceb31d473\") " pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:57.019832 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:57.019672 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7431d8d2-7cd4-4bc5-920c-37eceb31d473-cert\") pod \"llmisvc-controller-manager-6df48d4859-mht88\" (UID: \"7431d8d2-7cd4-4bc5-920c-37eceb31d473\") " 
pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:57.021918 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:57.021894 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7431d8d2-7cd4-4bc5-920c-37eceb31d473-cert\") pod \"llmisvc-controller-manager-6df48d4859-mht88\" (UID: \"7431d8d2-7cd4-4bc5-920c-37eceb31d473\") " pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:57.028247 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:57.028216 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chl4k\" (UniqueName: \"kubernetes.io/projected/7431d8d2-7cd4-4bc5-920c-37eceb31d473-kube-api-access-chl4k\") pod \"llmisvc-controller-manager-6df48d4859-mht88\" (UID: \"7431d8d2-7cd4-4bc5-920c-37eceb31d473\") " pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:57.211267 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:57.211182 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:57.327436 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:57.327411 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-6df48d4859-mht88"] Apr 17 16:49:57.329921 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:49:57.329890 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7431d8d2_7cd4_4bc5_920c_37eceb31d473.slice/crio-95be5f7a48b2011bd20122e52622703892dd787f40481aa30a35e51f6a345be3 WatchSource:0}: Error finding container 95be5f7a48b2011bd20122e52622703892dd787f40481aa30a35e51f6a345be3: Status 404 returned error can't find the container with id 95be5f7a48b2011bd20122e52622703892dd787f40481aa30a35e51f6a345be3 Apr 17 16:49:58.185217 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:58.185174 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" event={"ID":"7431d8d2-7cd4-4bc5-920c-37eceb31d473","Type":"ContainerStarted","Data":"ac5ab2bd7bf4f92d6a27bd7b72071771d58ab71d555b19b32690753a0b933282"} Apr 17 16:49:58.185217 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:58.185216 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" event={"ID":"7431d8d2-7cd4-4bc5-920c-37eceb31d473","Type":"ContainerStarted","Data":"95be5f7a48b2011bd20122e52622703892dd787f40481aa30a35e51f6a345be3"} Apr 17 16:49:58.185632 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:58.185235 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:49:58.200184 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:49:58.200068 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" podStartSLOduration=1.590071394 podStartE2EDuration="2.200052466s" 
podCreationTimestamp="2026-04-17 16:49:56 +0000 UTC" firstStartedPulling="2026-04-17 16:49:57.331228746 +0000 UTC m=+1145.228990179" lastFinishedPulling="2026-04-17 16:49:57.941209804 +0000 UTC m=+1145.838971251" observedRunningTime="2026-04-17 16:49:58.199433383 +0000 UTC m=+1146.097194838" watchObservedRunningTime="2026-04-17 16:49:58.200052466 +0000 UTC m=+1146.097813917" Apr 17 16:50:12.157426 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:12.157350 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:50:29.190937 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.190905 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-6df48d4859-mht88" Apr 17 16:50:29.232394 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.232362 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc"] Apr 17 16:50:29.232685 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.232658 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" podUID="3753367d-0233-4668-8ad8-67cf282aa3c3" containerName="manager" containerID="cri-o://06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340" gracePeriod=30 Apr 17 16:50:29.472341 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.472318 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:50:29.576371 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.576332 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtch\" (UniqueName: \"kubernetes.io/projected/3753367d-0233-4668-8ad8-67cf282aa3c3-kube-api-access-zrtch\") pod \"3753367d-0233-4668-8ad8-67cf282aa3c3\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " Apr 17 16:50:29.576543 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.576413 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert\") pod \"3753367d-0233-4668-8ad8-67cf282aa3c3\" (UID: \"3753367d-0233-4668-8ad8-67cf282aa3c3\") " Apr 17 16:50:29.578547 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.578510 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3753367d-0233-4668-8ad8-67cf282aa3c3-kube-api-access-zrtch" (OuterVolumeSpecName: "kube-api-access-zrtch") pod "3753367d-0233-4668-8ad8-67cf282aa3c3" (UID: "3753367d-0233-4668-8ad8-67cf282aa3c3"). InnerVolumeSpecName "kube-api-access-zrtch". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:50:29.578672 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.578604 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert" (OuterVolumeSpecName: "cert") pod "3753367d-0233-4668-8ad8-67cf282aa3c3" (UID: "3753367d-0233-4668-8ad8-67cf282aa3c3"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:50:29.677139 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.677034 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrtch\" (UniqueName: \"kubernetes.io/projected/3753367d-0233-4668-8ad8-67cf282aa3c3-kube-api-access-zrtch\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:50:29.677139 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:29.677059 2561 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3753367d-0233-4668-8ad8-67cf282aa3c3-cert\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:50:30.290690 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.290650 2561 generic.go:358] "Generic (PLEG): container finished" podID="3753367d-0233-4668-8ad8-67cf282aa3c3" containerID="06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340" exitCode=0 Apr 17 16:50:30.290690 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.290691 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" event={"ID":"3753367d-0233-4668-8ad8-67cf282aa3c3","Type":"ContainerDied","Data":"06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340"} Apr 17 16:50:30.291245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.290715 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" event={"ID":"3753367d-0233-4668-8ad8-67cf282aa3c3","Type":"ContainerDied","Data":"f76e74c1f5722ce582dc08819967802daa767e410ea30164abdb8dd5b92942d3"} Apr 17 16:50:30.291245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.290721 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc" Apr 17 16:50:30.291245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.290735 2561 scope.go:117] "RemoveContainer" containerID="06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340" Apr 17 16:50:30.298987 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.298972 2561 scope.go:117] "RemoveContainer" containerID="06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340" Apr 17 16:50:30.299264 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:50:30.299244 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340\": container with ID starting with 06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340 not found: ID does not exist" containerID="06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340" Apr 17 16:50:30.299315 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.299272 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340"} err="failed to get container status \"06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340\": rpc error: code = NotFound desc = could not find container \"06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340\": container with ID starting with 06ce875e692b37e6a1c1c9c56cb876064e8aa0ce045e3232fafce582799b0340 not found: ID does not exist" Apr 17 16:50:30.312537 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.312510 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc"] Apr 17 16:50:30.314703 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:50:30.314682 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-868cc5c7bb-m6bfc"] Apr 17 16:50:30.522425 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:50:30.522392 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3753367d-0233-4668-8ad8-67cf282aa3c3" path="/var/lib/kubelet/pods/3753367d-0233-4668-8ad8-67cf282aa3c3/volumes" Apr 17 16:51:34.575105 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:34.574996 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr"] Apr 17 16:51:34.575600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:34.575456 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="tokenizer" containerID="cri-o://11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1" gracePeriod=30 Apr 17 16:51:34.575674 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:34.575636 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="main" containerID="cri-o://9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998" gracePeriod=30 Apr 17 16:51:35.509869 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:35.509834 2561 generic.go:358] "Generic (PLEG): container finished" podID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerID="9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998" exitCode=0 Apr 17 16:51:35.510060 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:35.509909 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" event={"ID":"1628d5fd-7e23-4470-8418-459c3a86ddd0","Type":"ContainerDied","Data":"9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998"} Apr 17 16:51:35.695520 ip-10-0-130-35 kubenswrapper[2561]: E0417 
16:51:35.695490 2561 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1628d5fd_7e23_4470_8418_459c3a86ddd0.slice/crio-11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1.scope\": RecentStats: unable to find data in memory cache]" Apr 17 16:51:35.931142 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:35.931117 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:51:36.011144 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011108 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1628d5fd-7e23-4470-8418-459c3a86ddd0-tls-certs\") pod \"1628d5fd-7e23-4470-8418-459c3a86ddd0\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " Apr 17 16:51:36.011289 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011159 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-cache\") pod \"1628d5fd-7e23-4470-8418-459c3a86ddd0\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " Apr 17 16:51:36.011289 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011189 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-tmp\") pod \"1628d5fd-7e23-4470-8418-459c3a86ddd0\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " Apr 17 16:51:36.011289 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011226 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-uds\") pod 
\"1628d5fd-7e23-4470-8418-459c3a86ddd0\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " Apr 17 16:51:36.011289 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011262 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9mg9\" (UniqueName: \"kubernetes.io/projected/1628d5fd-7e23-4470-8418-459c3a86ddd0-kube-api-access-x9mg9\") pod \"1628d5fd-7e23-4470-8418-459c3a86ddd0\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " Apr 17 16:51:36.011477 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011289 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-kserve-provision-location\") pod \"1628d5fd-7e23-4470-8418-459c3a86ddd0\" (UID: \"1628d5fd-7e23-4470-8418-459c3a86ddd0\") " Apr 17 16:51:36.011529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011486 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "1628d5fd-7e23-4470-8418-459c3a86ddd0" (UID: "1628d5fd-7e23-4470-8418-459c3a86ddd0"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:51:36.011585 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011562 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "1628d5fd-7e23-4470-8418-459c3a86ddd0" (UID: "1628d5fd-7e23-4470-8418-459c3a86ddd0"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:51:36.011721 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.011692 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "1628d5fd-7e23-4470-8418-459c3a86ddd0" (UID: "1628d5fd-7e23-4470-8418-459c3a86ddd0"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:51:36.012179 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.012152 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1628d5fd-7e23-4470-8418-459c3a86ddd0" (UID: "1628d5fd-7e23-4470-8418-459c3a86ddd0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:51:36.013358 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.013339 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1628d5fd-7e23-4470-8418-459c3a86ddd0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1628d5fd-7e23-4470-8418-459c3a86ddd0" (UID: "1628d5fd-7e23-4470-8418-459c3a86ddd0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:51:36.013418 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.013370 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1628d5fd-7e23-4470-8418-459c3a86ddd0-kube-api-access-x9mg9" (OuterVolumeSpecName: "kube-api-access-x9mg9") pod "1628d5fd-7e23-4470-8418-459c3a86ddd0" (UID: "1628d5fd-7e23-4470-8418-459c3a86ddd0"). InnerVolumeSpecName "kube-api-access-x9mg9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:51:36.112383 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.112348 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:51:36.112383 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.112380 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:51:36.112383 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.112389 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:51:36.112611 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.112398 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x9mg9\" (UniqueName: \"kubernetes.io/projected/1628d5fd-7e23-4470-8418-459c3a86ddd0-kube-api-access-x9mg9\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:51:36.112611 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.112410 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1628d5fd-7e23-4470-8418-459c3a86ddd0-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:51:36.112611 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.112420 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1628d5fd-7e23-4470-8418-459c3a86ddd0-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:51:36.515111 ip-10-0-130-35 kubenswrapper[2561]: 
I0417 16:51:36.514998 2561 generic.go:358] "Generic (PLEG): container finished" podID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerID="11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1" exitCode=0 Apr 17 16:51:36.515111 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.515063 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" event={"ID":"1628d5fd-7e23-4470-8418-459c3a86ddd0","Type":"ContainerDied","Data":"11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1"} Apr 17 16:51:36.515339 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.515126 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" event={"ID":"1628d5fd-7e23-4470-8418-459c3a86ddd0","Type":"ContainerDied","Data":"b4b325238b8262ea705b7362ca13427427beb1568682b64a63e26c4e9b0e1873"} Apr 17 16:51:36.515339 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.515148 2561 scope.go:117] "RemoveContainer" containerID="11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1" Apr 17 16:51:36.515339 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.515070 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr" Apr 17 16:51:36.524214 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.524193 2561 scope.go:117] "RemoveContainer" containerID="9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998" Apr 17 16:51:36.533760 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.533742 2561 scope.go:117] "RemoveContainer" containerID="6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6" Apr 17 16:51:36.538619 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.538594 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr"] Apr 17 16:51:36.542060 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.542038 2561 scope.go:117] "RemoveContainer" containerID="11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1" Apr 17 16:51:36.542577 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:51:36.542544 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1\": container with ID starting with 11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1 not found: ID does not exist" containerID="11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1" Apr 17 16:51:36.542659 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.542580 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1"} err="failed to get container status \"11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1\": rpc error: code = NotFound desc = could not find container \"11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1\": container with ID starting with 11159abc12a41fe5f76dd3526630468468b30b0f17ca447f7d44e687d51bf0c1 not 
found: ID does not exist" Apr 17 16:51:36.542659 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.542603 2561 scope.go:117] "RemoveContainer" containerID="9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998" Apr 17 16:51:36.542873 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:51:36.542853 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998\": container with ID starting with 9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998 not found: ID does not exist" containerID="9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998" Apr 17 16:51:36.542921 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.542879 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998"} err="failed to get container status \"9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998\": rpc error: code = NotFound desc = could not find container \"9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998\": container with ID starting with 9fae0ae2111026447251cd45c80220e72cc0e39fa64a5a4faca142feb1b61998 not found: ID does not exist" Apr 17 16:51:36.542921 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.542896 2561 scope.go:117] "RemoveContainer" containerID="6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6" Apr 17 16:51:36.542921 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.542905 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-b9d5488f7-lgrpr"] Apr 17 16:51:36.543184 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:51:36.543169 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6\": container with ID starting with 6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6 not found: ID does not exist" containerID="6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6" Apr 17 16:51:36.543241 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:36.543187 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6"} err="failed to get container status \"6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6\": rpc error: code = NotFound desc = could not find container \"6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6\": container with ID starting with 6a753e51cd57b9b8263dd770d1bccca8711671132ae06dcd91ce43fc0564bcb6 not found: ID does not exist" Apr 17 16:51:38.523204 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:38.523163 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" path="/var/lib/kubelet/pods/1628d5fd-7e23-4470-8418-459c3a86ddd0/volumes" Apr 17 16:51:56.490887 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.490840 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl"] Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491151 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="tokenizer" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491163 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="tokenizer" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491173 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="storage-initializer" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491179 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="storage-initializer" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491188 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3753367d-0233-4668-8ad8-67cf282aa3c3" containerName="manager" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491194 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="3753367d-0233-4668-8ad8-67cf282aa3c3" containerName="manager" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491206 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="main" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491214 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="main" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491258 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="main" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491265 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="3753367d-0233-4668-8ad8-67cf282aa3c3" containerName="manager" Apr 17 16:51:56.491291 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.491275 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="1628d5fd-7e23-4470-8418-459c3a86ddd0" containerName="tokenizer" Apr 17 16:51:56.496210 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.496183 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.499150 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.499122 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:51:56.499305 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.499156 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:51:56.499305 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.499214 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-8k9n2\"" Apr 17 16:51:56.499305 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.499122 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\"" Apr 17 16:51:56.499452 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.499155 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 17 16:51:56.504621 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.504595 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl"] Apr 17 16:51:56.575067 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.575023 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.575264 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.575099 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.575264 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.575133 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.575264 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.575159 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.575264 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.575243 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsg7q\" (UniqueName: \"kubernetes.io/projected/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kube-api-access-bsg7q\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: 
\"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.575401 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.575269 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.676474 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676439 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.676647 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676482 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.676647 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676504 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: 
\"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.676647 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676536 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.676647 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676582 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsg7q\" (UniqueName: \"kubernetes.io/projected/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kube-api-access-bsg7q\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.676647 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676608 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.676939 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676914 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: 
\"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.676939 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676929 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.677028 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.676973 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.677028 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.677002 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.678935 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.678912 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.684961 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.684937 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsg7q\" (UniqueName: \"kubernetes.io/projected/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kube-api-access-bsg7q\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.807861 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.807765 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:56.939028 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:56.939002 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl"] Apr 17 16:51:56.941687 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:51:56.941659 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9173e6c_1a38_4bf3_b66d_c5698b3f56c3.slice/crio-9f6aad1ba56002c37402a6caa11f2f935ac66f9abbcab62be9b913989ce00fc5 WatchSource:0}: Error finding container 9f6aad1ba56002c37402a6caa11f2f935ac66f9abbcab62be9b913989ce00fc5: Status 404 returned error can't find the container with id 9f6aad1ba56002c37402a6caa11f2f935ac66f9abbcab62be9b913989ce00fc5 Apr 17 16:51:57.588138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:57.588103 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" event={"ID":"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3","Type":"ContainerStarted","Data":"b9ceb329489f3fe69935cac4d76fc3902f194b7529d67526527ab2998659f4fa"} Apr 17 
16:51:57.588138 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:57.588141 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" event={"ID":"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3","Type":"ContainerStarted","Data":"9f6aad1ba56002c37402a6caa11f2f935ac66f9abbcab62be9b913989ce00fc5"} Apr 17 16:51:58.592294 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:58.592255 2561 generic.go:358] "Generic (PLEG): container finished" podID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerID="b9ceb329489f3fe69935cac4d76fc3902f194b7529d67526527ab2998659f4fa" exitCode=0 Apr 17 16:51:58.592684 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:58.592345 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" event={"ID":"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3","Type":"ContainerDied","Data":"b9ceb329489f3fe69935cac4d76fc3902f194b7529d67526527ab2998659f4fa"} Apr 17 16:51:59.602097 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:59.597836 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" event={"ID":"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3","Type":"ContainerStarted","Data":"7a4de0e4d22d700bf78742604d3d0c67d8d358ce4b3892dcab082b0ebb0ae60e"} Apr 17 16:51:59.602097 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:59.597881 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" event={"ID":"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3","Type":"ContainerStarted","Data":"3d877f67c44ab0604e550848f02bcbb9bca94257c9b10fff6b68fb338cd58d23"} Apr 17 16:51:59.602097 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:59.598708 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:51:59.620870 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:51:59.620814 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" podStartSLOduration=3.620797596 podStartE2EDuration="3.620797596s" podCreationTimestamp="2026-04-17 16:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:51:59.619335638 +0000 UTC m=+1267.517097093" watchObservedRunningTime="2026-04-17 16:51:59.620797596 +0000 UTC m=+1267.518559055" Apr 17 16:52:06.807966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:52:06.807927 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:52:06.807966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:52:06.807977 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:52:06.810917 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:52:06.810889 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:52:07.624652 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:52:07.624617 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:52:28.627851 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:52:28.627824 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" Apr 17 16:53:22.533176 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:53:22.533126 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"] Apr 17 16:53:22.536515 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.536498 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.538768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.538743 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-8mmzt\"" Apr 17 16:53:22.538881 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.538743 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 17 16:53:22.547689 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.547664 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"] Apr 17 16:53:22.678535 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.678481 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cmcs\" (UniqueName: \"kubernetes.io/projected/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kube-api-access-6cmcs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.678725 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.678555 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-uds\") pod 
\"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.678725 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.678578 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.678725 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.678604 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.678725 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.678620 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.678725 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.678696 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779037 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779006 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779243 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779055 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cmcs\" (UniqueName: \"kubernetes.io/projected/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kube-api-access-6cmcs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779243 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779125 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779243 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779165 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779243 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779200 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779243 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779224 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779530 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779492 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779588 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779531 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779588 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779555 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.779754 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.779631 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.782057 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.782036 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.787989 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.787931 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cmcs\" (UniqueName: \"kubernetes.io/projected/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kube-api-access-6cmcs\") pod 
\"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.848485 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.848448 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:22.977182 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.977140 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"] Apr 17 16:53:22.980416 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:53:22.980383 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34336a8d_7a6a_49b9_b8a6_4b7f787e2451.slice/crio-4eacf29d5f096cba91109099cc5565f1a322e79bd0088679ab81c6dc5e329bc4 WatchSource:0}: Error finding container 4eacf29d5f096cba91109099cc5565f1a322e79bd0088679ab81c6dc5e329bc4: Status 404 returned error can't find the container with id 4eacf29d5f096cba91109099cc5565f1a322e79bd0088679ab81c6dc5e329bc4 Apr 17 16:53:22.982262 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:22.982242 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:53:23.866832 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:23.866796 2561 generic.go:358] "Generic (PLEG): container finished" podID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerID="0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157" exitCode=0 Apr 17 16:53:23.867261 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:23.866868 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" 
event={"ID":"34336a8d-7a6a-49b9-b8a6-4b7f787e2451","Type":"ContainerDied","Data":"0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157"} Apr 17 16:53:23.867261 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:23.866910 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" event={"ID":"34336a8d-7a6a-49b9-b8a6-4b7f787e2451","Type":"ContainerStarted","Data":"4eacf29d5f096cba91109099cc5565f1a322e79bd0088679ab81c6dc5e329bc4"} Apr 17 16:53:24.872006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:24.871971 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" event={"ID":"34336a8d-7a6a-49b9-b8a6-4b7f787e2451","Type":"ContainerStarted","Data":"4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd"} Apr 17 16:53:24.872006 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:24.872004 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" event={"ID":"34336a8d-7a6a-49b9-b8a6-4b7f787e2451","Type":"ContainerStarted","Data":"634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be"} Apr 17 16:53:24.872472 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:24.872238 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" Apr 17 16:53:24.892330 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:24.892278 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" podStartSLOduration=2.892261551 podStartE2EDuration="2.892261551s" podCreationTimestamp="2026-04-17 16:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 16:53:24.890509176 +0000 UTC m=+1352.788270633" watchObservedRunningTime="2026-04-17 16:53:24.892261551 +0000 UTC m=+1352.790023070"
Apr 17 16:53:32.848694 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:32.848654 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"
Apr 17 16:53:32.848694 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:32.848695 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"
Apr 17 16:53:32.851599 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:32.851572 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"
Apr 17 16:53:32.898450 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:32.898418 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"
Apr 17 16:53:53.902276 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:53.902243 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"
Apr 17 16:53:58.873644 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:58.873604 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl"]
Apr 17 16:53:58.874091 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:58.873965 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="main" containerID="cri-o://3d877f67c44ab0604e550848f02bcbb9bca94257c9b10fff6b68fb338cd58d23" gracePeriod=30
Apr 17 16:53:58.874091 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:58.874015 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="tokenizer" containerID="cri-o://7a4de0e4d22d700bf78742604d3d0c67d8d358ce4b3892dcab082b0ebb0ae60e" gracePeriod=30
Apr 17 16:53:58.994112 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:58.994055 2561 generic.go:358] "Generic (PLEG): container finished" podID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerID="3d877f67c44ab0604e550848f02bcbb9bca94257c9b10fff6b68fb338cd58d23" exitCode=0
Apr 17 16:53:58.994238 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:53:58.994119 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" event={"ID":"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3","Type":"ContainerDied","Data":"3d877f67c44ab0604e550848f02bcbb9bca94257c9b10fff6b68fb338cd58d23"}
Apr 17 16:54:00.000296 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.000266 2561 generic.go:358] "Generic (PLEG): container finished" podID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerID="7a4de0e4d22d700bf78742604d3d0c67d8d358ce4b3892dcab082b0ebb0ae60e" exitCode=0
Apr 17 16:54:00.000645 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.000346 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" event={"ID":"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3","Type":"ContainerDied","Data":"7a4de0e4d22d700bf78742604d3d0c67d8d358ce4b3892dcab082b0ebb0ae60e"}
Apr 17 16:54:00.119196 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.119169 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl"
Apr 17 16:54:00.141134 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141103 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tls-certs\") pod \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") "
Apr 17 16:54:00.141289 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141169 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-uds\") pod \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") "
Apr 17 16:54:00.141289 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141238 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-cache\") pod \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") "
Apr 17 16:54:00.141289 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141273 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsg7q\" (UniqueName: \"kubernetes.io/projected/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kube-api-access-bsg7q\") pod \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") "
Apr 17 16:54:00.141426 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141306 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-tmp\") pod \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") "
Apr 17 16:54:00.141426 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141332 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kserve-provision-location\") pod \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\" (UID: \"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3\") "
Apr 17 16:54:00.141426 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141387 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" (UID: "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:00.141562 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141483 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" (UID: "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:00.141617 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141582 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:54:00.141617 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141601 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:54:00.141764 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.141739 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" (UID: "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:00.142207 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.142178 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" (UID: "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:54:00.143525 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.143493 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" (UID: "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:54:00.143625 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.143527 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kube-api-access-bsg7q" (OuterVolumeSpecName: "kube-api-access-bsg7q") pod "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" (UID: "e9173e6c-1a38-4bf3-b66d-c5698b3f56c3"). InnerVolumeSpecName "kube-api-access-bsg7q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:54:00.242388 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.242300 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:54:00.242388 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.242333 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsg7q\" (UniqueName: \"kubernetes.io/projected/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kube-api-access-bsg7q\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:54:00.242388 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.242343 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:54:00.242388 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:00.242353 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:54:01.005754 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:01.005714 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl" event={"ID":"e9173e6c-1a38-4bf3-b66d-c5698b3f56c3","Type":"ContainerDied","Data":"9f6aad1ba56002c37402a6caa11f2f935ac66f9abbcab62be9b913989ce00fc5"}
Apr 17 16:54:01.005754 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:01.005740 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl"
Apr 17 16:54:01.006264 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:01.005773 2561 scope.go:117] "RemoveContainer" containerID="7a4de0e4d22d700bf78742604d3d0c67d8d358ce4b3892dcab082b0ebb0ae60e"
Apr 17 16:54:01.014018 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:01.013997 2561 scope.go:117] "RemoveContainer" containerID="3d877f67c44ab0604e550848f02bcbb9bca94257c9b10fff6b68fb338cd58d23"
Apr 17 16:54:01.021240 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:01.021212 2561 scope.go:117] "RemoveContainer" containerID="b9ceb329489f3fe69935cac4d76fc3902f194b7529d67526527ab2998659f4fa"
Apr 17 16:54:01.022683 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:01.022654 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl"]
Apr 17 16:54:01.026871 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:01.026846 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenx9dl"]
Apr 17 16:54:02.523398 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:02.523364 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" path="/var/lib/kubelet/pods/e9173e6c-1a38-4bf3-b66d-c5698b3f56c3/volumes"
Apr 17 16:54:05.878307 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.878274 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"]
Apr 17 16:54:05.879255 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.879230 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="tokenizer"
Apr 17 16:54:05.879427 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.879413 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="tokenizer"
Apr 17 16:54:05.879529 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.879517 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="storage-initializer"
Apr 17 16:54:05.879624 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.879597 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="storage-initializer"
Apr 17 16:54:05.879624 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.879617 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="main"
Apr 17 16:54:05.879624 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.879625 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="main"
Apr 17 16:54:05.879856 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.879730 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="main"
Apr 17 16:54:05.879856 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.879746 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9173e6c-1a38-4bf3-b66d-c5698b3f56c3" containerName="tokenizer"
Apr 17 16:54:05.884613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.884596 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:05.886839 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.886816 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-ch7cm\""
Apr 17 16:54:05.886954 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.886872 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 17 16:54:05.890278 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.890249 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"]
Apr 17 16:54:05.990151 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.990114 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:05.990352 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.990183 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72tdc\" (UniqueName: \"kubernetes.io/projected/e20c62cd-e603-45d1-97c4-551cd3b5e352-kube-api-access-72tdc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:05.990352 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.990219 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:05.990352 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.990278 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e20c62cd-e603-45d1-97c4-551cd3b5e352-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:05.990352 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.990322 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:05.990352 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:05.990348 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.090710 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.090680 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.090904 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.090730 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72tdc\" (UniqueName: \"kubernetes.io/projected/e20c62cd-e603-45d1-97c4-551cd3b5e352-kube-api-access-72tdc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.090904 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.090756 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.090904 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.090780 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e20c62cd-e603-45d1-97c4-551cd3b5e352-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.090904 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.090804 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.090904 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.090832 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.091188 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.091135 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.091243 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.091203 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.091296 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.091247 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.091296 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.091262 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.093578 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.093551 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e20c62cd-e603-45d1-97c4-551cd3b5e352-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.098985 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.098958 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72tdc\" (UniqueName: \"kubernetes.io/projected/e20c62cd-e603-45d1-97c4-551cd3b5e352-kube-api-access-72tdc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.197656 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.197565 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:06.324148 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:06.324118 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"]
Apr 17 16:54:06.326908 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:54:06.326877 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode20c62cd_e603_45d1_97c4_551cd3b5e352.slice/crio-4340ee7cb9d8943593ee7d07df5b75cb509d656de483c06fc40e99fefac17fe0 WatchSource:0}: Error finding container 4340ee7cb9d8943593ee7d07df5b75cb509d656de483c06fc40e99fefac17fe0: Status 404 returned error can't find the container with id 4340ee7cb9d8943593ee7d07df5b75cb509d656de483c06fc40e99fefac17fe0
Apr 17 16:54:07.033884 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:07.033851 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" event={"ID":"e20c62cd-e603-45d1-97c4-551cd3b5e352","Type":"ContainerStarted","Data":"e9002fa63cf9058512c59a8036208575073221407a08253780e355eee1d8998b"}
Apr 17 16:54:07.033884 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:07.033889 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" event={"ID":"e20c62cd-e603-45d1-97c4-551cd3b5e352","Type":"ContainerStarted","Data":"4340ee7cb9d8943593ee7d07df5b75cb509d656de483c06fc40e99fefac17fe0"}
Apr 17 16:54:08.040931 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:08.040896 2561 generic.go:358] "Generic (PLEG): container finished" podID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerID="e9002fa63cf9058512c59a8036208575073221407a08253780e355eee1d8998b" exitCode=0
Apr 17 16:54:08.041347 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:08.040983 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" event={"ID":"e20c62cd-e603-45d1-97c4-551cd3b5e352","Type":"ContainerDied","Data":"e9002fa63cf9058512c59a8036208575073221407a08253780e355eee1d8998b"}
Apr 17 16:54:09.047342 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:09.047304 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" event={"ID":"e20c62cd-e603-45d1-97c4-551cd3b5e352","Type":"ContainerStarted","Data":"bded51f16a0c383d1d7cae9bc00e136541eae3bf36c646cd56600e39d861a03a"}
Apr 17 16:54:09.047342 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:09.047339 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" event={"ID":"e20c62cd-e603-45d1-97c4-551cd3b5e352","Type":"ContainerStarted","Data":"10454f3f7805697c71899b30eda258ef04c1d03f71b1ad42387b2bb015ff4846"}
Apr 17 16:54:09.047865 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:09.047413 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:09.072245 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:09.072189 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" podStartSLOduration=4.072174256 podStartE2EDuration="4.072174256s" podCreationTimestamp="2026-04-17 16:54:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:54:09.070650885 +0000 UTC m=+1396.968412342" watchObservedRunningTime="2026-04-17 16:54:09.072174256 +0000 UTC m=+1396.969935711"
Apr 17 16:54:16.197998 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:16.197962 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:16.198455 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:16.198132 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:16.200779 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:16.200757 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:17.084011 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:17.083979 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:54:39.089547 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:54:39.089467 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:55:13.469544 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:13.469502 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"]
Apr 17 16:55:13.470178 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:13.469899 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="main" containerID="cri-o://634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be" gracePeriod=30
Apr 17 16:55:13.470178 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:13.469967 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="tokenizer" containerID="cri-o://4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd" gracePeriod=30
Apr 17 16:55:13.901436 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:55:13.901394 2561 logging.go:55] [core] [Channel #402 SubChannel #403]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.35:9003", ServerName: "10.132.0.35:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.35:9003: connect: connection refused"
Apr 17 16:55:14.268327 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.268233 2561 generic.go:358] "Generic (PLEG): container finished" podID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerID="634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be" exitCode=0
Apr 17 16:55:14.268327 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.268282 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" event={"ID":"34336a8d-7a6a-49b9-b8a6-4b7f787e2451","Type":"ContainerDied","Data":"634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be"}
Apr 17 16:55:14.734247 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.734215 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"
Apr 17 16:55:14.877665 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.877579 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cmcs\" (UniqueName: \"kubernetes.io/projected/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kube-api-access-6cmcs\") pod \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") "
Apr 17 16:55:14.877665 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.877633 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-cache\") pod \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") "
Apr 17 16:55:14.877665 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.877653 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kserve-provision-location\") pod \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") "
Apr 17 16:55:14.877955 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.877678 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-uds\") pod \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") "
Apr 17 16:55:14.877955 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.877700 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tls-certs\") pod \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") "
Apr 17 16:55:14.877955 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.877729 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-tmp\") pod \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\" (UID: \"34336a8d-7a6a-49b9-b8a6-4b7f787e2451\") "
Apr 17 16:55:14.878139 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.877909 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "34336a8d-7a6a-49b9-b8a6-4b7f787e2451" (UID: "34336a8d-7a6a-49b9-b8a6-4b7f787e2451"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:14.878139 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.877980 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "34336a8d-7a6a-49b9-b8a6-4b7f787e2451" (UID: "34336a8d-7a6a-49b9-b8a6-4b7f787e2451"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:14.878260 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.878160 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "34336a8d-7a6a-49b9-b8a6-4b7f787e2451" (UID: "34336a8d-7a6a-49b9-b8a6-4b7f787e2451"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:14.878679 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.878652 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34336a8d-7a6a-49b9-b8a6-4b7f787e2451" (UID: "34336a8d-7a6a-49b9-b8a6-4b7f787e2451"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:14.879893 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.879867 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kube-api-access-6cmcs" (OuterVolumeSpecName: "kube-api-access-6cmcs") pod "34336a8d-7a6a-49b9-b8a6-4b7f787e2451" (UID: "34336a8d-7a6a-49b9-b8a6-4b7f787e2451"). InnerVolumeSpecName "kube-api-access-6cmcs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:55:14.879893 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.879866 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "34336a8d-7a6a-49b9-b8a6-4b7f787e2451" (UID: "34336a8d-7a6a-49b9-b8a6-4b7f787e2451"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:14.901260 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.901212 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.35:9003\" within 1s: context deadline exceeded"
Apr 17 16:55:14.979185 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.979143 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cmcs\" (UniqueName: \"kubernetes.io/projected/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kube-api-access-6cmcs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.979185 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.979172 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.979185 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.979183 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.979185 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.979192 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.979443 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.979201 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:14.979443 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:14.979209 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34336a8d-7a6a-49b9-b8a6-4b7f787e2451-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:15.273246 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.273142 2561 generic.go:358] "Generic (PLEG): container finished" podID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerID="4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd" exitCode=0
Apr 17 16:55:15.273246 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.273218 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"
Apr 17 16:55:15.273476 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.273208 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" event={"ID":"34336a8d-7a6a-49b9-b8a6-4b7f787e2451","Type":"ContainerDied","Data":"4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd"}
Apr 17 16:55:15.273476 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.273343 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf" event={"ID":"34336a8d-7a6a-49b9-b8a6-4b7f787e2451","Type":"ContainerDied","Data":"4eacf29d5f096cba91109099cc5565f1a322e79bd0088679ab81c6dc5e329bc4"}
Apr 17 16:55:15.273476 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.273366 2561 scope.go:117] "RemoveContainer" containerID="4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd"
Apr 17 16:55:15.281701 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.281683 2561 scope.go:117] "RemoveContainer"
containerID="634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be" Apr 17 16:55:15.289145 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.289125 2561 scope.go:117] "RemoveContainer" containerID="0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157" Apr 17 16:55:15.294566 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.294542 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"] Apr 17 16:55:15.296837 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.296817 2561 scope.go:117] "RemoveContainer" containerID="4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd" Apr 17 16:55:15.297118 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:55:15.297097 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd\": container with ID starting with 4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd not found: ID does not exist" containerID="4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd" Apr 17 16:55:15.297185 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.297126 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd"} err="failed to get container status \"4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd\": rpc error: code = NotFound desc = could not find container \"4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd\": container with ID starting with 4b846774d2e289a2a9522871143af275b9d03bb53b20c3cc6c4376e1fb8994dd not found: ID does not exist" Apr 17 16:55:15.297185 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.297147 2561 scope.go:117] "RemoveContainer" containerID="634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be" Apr 17 
16:55:15.297381 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:55:15.297362 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be\": container with ID starting with 634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be not found: ID does not exist" containerID="634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be" Apr 17 16:55:15.297447 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.297390 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be"} err="failed to get container status \"634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be\": rpc error: code = NotFound desc = could not find container \"634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be\": container with ID starting with 634b12c32ac0a1bc9766bac57590e33c7a4e41c7af4ef18fd35658c4e19582be not found: ID does not exist" Apr 17 16:55:15.297447 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.297418 2561 scope.go:117] "RemoveContainer" containerID="0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157" Apr 17 16:55:15.297639 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:55:15.297617 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157\": container with ID starting with 0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157 not found: ID does not exist" containerID="0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157" Apr 17 16:55:15.297681 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.297646 2561 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157"} err="failed to get container status \"0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157\": rpc error: code = NotFound desc = could not find container \"0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157\": container with ID starting with 0ee1e4f3fb82d7443acb260fd2326e1f1ab256149811d5206a57e3fc69a9f157 not found: ID does not exist" Apr 17 16:55:15.301300 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:15.301275 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schejkxtf"] Apr 17 16:55:16.522346 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:16.522313 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" path="/var/lib/kubelet/pods/34336a8d-7a6a-49b9-b8a6-4b7f787e2451/volumes" Apr 17 16:55:24.678274 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678230 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj"] Apr 17 16:55:24.678650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678534 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="tokenizer" Apr 17 16:55:24.678650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678545 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="tokenizer" Apr 17 16:55:24.678650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678565 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="main" Apr 17 16:55:24.678650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678570 2561 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="main" Apr 17 16:55:24.678650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678582 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="storage-initializer" Apr 17 16:55:24.678650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678589 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="storage-initializer" Apr 17 16:55:24.678650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678637 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="tokenizer" Apr 17 16:55:24.678650 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.678646 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="34336a8d-7a6a-49b9-b8a6-4b7f787e2451" containerName="main" Apr 17 16:55:24.683559 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.683528 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.685909 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.685886 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 17 16:55:24.686054 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.685951 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-mkxjm\"" Apr 17 16:55:24.695244 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.695216 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj"] Apr 17 16:55:24.759363 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.759328 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d00c91c-8b94-496b-b56f-a786d0ed7359-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.759363 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.759370 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.759606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.759399 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.759606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.759452 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.759606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.759545 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.759606 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.759586 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hqk9\" (UniqueName: \"kubernetes.io/projected/0d00c91c-8b94-496b-b56f-a786d0ed7359-kube-api-access-2hqk9\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.860768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.860726 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.860966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.860777 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.860966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.860799 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.860966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.860859 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.860966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.860888 2561 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2hqk9\" (UniqueName: \"kubernetes.io/projected/0d00c91c-8b94-496b-b56f-a786d0ed7359-kube-api-access-2hqk9\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.860966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.860933 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d00c91c-8b94-496b-b56f-a786d0ed7359-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.861281 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.861255 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.861345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.861277 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.861345 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.861322 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.861447 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.861366 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.863410 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.863388 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d00c91c-8b94-496b-b56f-a786d0ed7359-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.868821 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.868799 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hqk9\" (UniqueName: \"kubernetes.io/projected/0d00c91c-8b94-496b-b56f-a786d0ed7359-kube-api-access-2hqk9\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:24.997305 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:24.997207 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:25.135507 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:25.134565 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj"] Apr 17 16:55:25.138979 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:55:25.138914 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d00c91c_8b94_496b_b56f_a786d0ed7359.slice/crio-f50a27a0dd9dceea382153a162d7eba29ad224b4c92858800a8104947d9f8f26 WatchSource:0}: Error finding container f50a27a0dd9dceea382153a162d7eba29ad224b4c92858800a8104947d9f8f26: Status 404 returned error can't find the container with id f50a27a0dd9dceea382153a162d7eba29ad224b4c92858800a8104947d9f8f26 Apr 17 16:55:25.309301 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:25.309266 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" event={"ID":"0d00c91c-8b94-496b-b56f-a786d0ed7359","Type":"ContainerStarted","Data":"1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821"} Apr 17 16:55:25.309301 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:25.309306 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" event={"ID":"0d00c91c-8b94-496b-b56f-a786d0ed7359","Type":"ContainerStarted","Data":"f50a27a0dd9dceea382153a162d7eba29ad224b4c92858800a8104947d9f8f26"} Apr 17 16:55:26.313814 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:26.313769 2561 generic.go:358] "Generic (PLEG): container finished" podID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerID="1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821" exitCode=0 Apr 17 16:55:26.314299 ip-10-0-130-35 kubenswrapper[2561]: I0417 
16:55:26.313814 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" event={"ID":"0d00c91c-8b94-496b-b56f-a786d0ed7359","Type":"ContainerDied","Data":"1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821"} Apr 17 16:55:27.319116 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:27.319055 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" event={"ID":"0d00c91c-8b94-496b-b56f-a786d0ed7359","Type":"ContainerStarted","Data":"dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7"} Apr 17 16:55:27.319490 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:27.319125 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" event={"ID":"0d00c91c-8b94-496b-b56f-a786d0ed7359","Type":"ContainerStarted","Data":"b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5"} Apr 17 16:55:27.319490 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:27.319233 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:27.339678 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:27.339628 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" podStartSLOduration=3.339613869 podStartE2EDuration="3.339613869s" podCreationTimestamp="2026-04-17 16:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:55:27.337244099 +0000 UTC m=+1475.235005553" watchObservedRunningTime="2026-04-17 16:55:27.339613869 +0000 UTC m=+1475.237375323" Apr 17 16:55:34.997945 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:55:34.997903 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:34.997945 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:34.997952 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:35.000729 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:35.000705 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:35.347100 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:35.347060 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:56.350837 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:56.350807 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:57.674710 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:57.674670 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj"] Apr 17 16:55:57.675162 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:57.675068 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="main" containerID="cri-o://b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5" gracePeriod=30 Apr 17 16:55:57.675494 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:57.675441 2561 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="tokenizer" containerID="cri-o://dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7" gracePeriod=30 Apr 17 16:55:58.420790 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:58.420758 2561 generic.go:358] "Generic (PLEG): container finished" podID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerID="b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5" exitCode=0 Apr 17 16:55:58.420980 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:58.420831 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" event={"ID":"0d00c91c-8b94-496b-b56f-a786d0ed7359","Type":"ContainerDied","Data":"b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5"} Apr 17 16:55:58.921234 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:58.921209 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" Apr 17 16:55:59.035276 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035190 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-kserve-provision-location\") pod \"0d00c91c-8b94-496b-b56f-a786d0ed7359\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " Apr 17 16:55:59.035276 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035243 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-cache\") pod \"0d00c91c-8b94-496b-b56f-a786d0ed7359\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " Apr 17 16:55:59.035276 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035269 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d00c91c-8b94-496b-b56f-a786d0ed7359-tls-certs\") pod \"0d00c91c-8b94-496b-b56f-a786d0ed7359\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " Apr 17 16:55:59.035518 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035309 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-tmp\") pod \"0d00c91c-8b94-496b-b56f-a786d0ed7359\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " Apr 17 16:55:59.035518 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035325 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hqk9\" (UniqueName: \"kubernetes.io/projected/0d00c91c-8b94-496b-b56f-a786d0ed7359-kube-api-access-2hqk9\") pod \"0d00c91c-8b94-496b-b56f-a786d0ed7359\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") " 
Apr 17 16:55:59.035518 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035351 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-uds\") pod \"0d00c91c-8b94-496b-b56f-a786d0ed7359\" (UID: \"0d00c91c-8b94-496b-b56f-a786d0ed7359\") "
Apr 17 16:55:59.035655 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035520 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0d00c91c-8b94-496b-b56f-a786d0ed7359" (UID: "0d00c91c-8b94-496b-b56f-a786d0ed7359"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:59.035716 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035683 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0d00c91c-8b94-496b-b56f-a786d0ed7359" (UID: "0d00c91c-8b94-496b-b56f-a786d0ed7359"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:59.035716 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.035703 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0d00c91c-8b94-496b-b56f-a786d0ed7359" (UID: "0d00c91c-8b94-496b-b56f-a786d0ed7359"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:59.036123 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.036068 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0d00c91c-8b94-496b-b56f-a786d0ed7359" (UID: "0d00c91c-8b94-496b-b56f-a786d0ed7359"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:55:59.037503 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.037480 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d00c91c-8b94-496b-b56f-a786d0ed7359-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0d00c91c-8b94-496b-b56f-a786d0ed7359" (UID: "0d00c91c-8b94-496b-b56f-a786d0ed7359"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:55:59.037736 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.037529 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d00c91c-8b94-496b-b56f-a786d0ed7359-kube-api-access-2hqk9" (OuterVolumeSpecName: "kube-api-access-2hqk9") pod "0d00c91c-8b94-496b-b56f-a786d0ed7359" (UID: "0d00c91c-8b94-496b-b56f-a786d0ed7359"). InnerVolumeSpecName "kube-api-access-2hqk9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:55:59.136688 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.136649 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:59.136688 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.136679 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2hqk9\" (UniqueName: \"kubernetes.io/projected/0d00c91c-8b94-496b-b56f-a786d0ed7359-kube-api-access-2hqk9\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:59.136688 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.136689 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:59.136688 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.136700 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:59.136957 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.136709 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d00c91c-8b94-496b-b56f-a786d0ed7359-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:59.136957 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.136718 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d00c91c-8b94-496b-b56f-a786d0ed7359-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:55:59.425487 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.425455 2561 generic.go:358] "Generic (PLEG): container finished" podID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerID="dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7" exitCode=0
Apr 17 16:55:59.425684 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.425531 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" event={"ID":"0d00c91c-8b94-496b-b56f-a786d0ed7359","Type":"ContainerDied","Data":"dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7"}
Apr 17 16:55:59.425684 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.425536 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj"
Apr 17 16:55:59.425684 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.425558 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj" event={"ID":"0d00c91c-8b94-496b-b56f-a786d0ed7359","Type":"ContainerDied","Data":"f50a27a0dd9dceea382153a162d7eba29ad224b4c92858800a8104947d9f8f26"}
Apr 17 16:55:59.425684 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.425573 2561 scope.go:117] "RemoveContainer" containerID="dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7"
Apr 17 16:55:59.433596 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.433576 2561 scope.go:117] "RemoveContainer" containerID="b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5"
Apr 17 16:55:59.440744 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.440725 2561 scope.go:117] "RemoveContainer" containerID="1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821"
Apr 17 16:55:59.447242 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.447219 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj"]
Apr 17 16:55:59.448127 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.448107 2561 scope.go:117] "RemoveContainer" containerID="dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7"
Apr 17 16:55:59.448379 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:55:59.448354 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7\": container with ID starting with dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7 not found: ID does not exist" containerID="dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7"
Apr 17 16:55:59.448442 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.448381 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7"} err="failed to get container status \"dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7\": rpc error: code = NotFound desc = could not find container \"dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7\": container with ID starting with dcfcad7ced6e434f259a0080ca85976dd06d220902e7014461e81bb0a1b53ca7 not found: ID does not exist"
Apr 17 16:55:59.448442 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.448399 2561 scope.go:117] "RemoveContainer" containerID="b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5"
Apr 17 16:55:59.448631 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:55:59.448613 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5\": container with ID starting with b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5 not found: ID does not exist" containerID="b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5"
Apr 17 16:55:59.448702 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.448637 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5"} err="failed to get container status \"b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5\": rpc error: code = NotFound desc = could not find container \"b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5\": container with ID starting with b1dd25fef697a7b92496bc37ea560fbc2ec98d962003c0ab6067d0db158a3dc5 not found: ID does not exist"
Apr 17 16:55:59.448702 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.448654 2561 scope.go:117] "RemoveContainer" containerID="1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821"
Apr 17 16:55:59.448905 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:55:59.448887 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821\": container with ID starting with 1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821 not found: ID does not exist" containerID="1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821"
Apr 17 16:55:59.449024 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.448908 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821"} err="failed to get container status \"1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821\": rpc error: code = NotFound desc = could not find container \"1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821\": container with ID starting with 1921bab70215572f437b7547d6a90ad0310b20ba1c3e0a2bd8fc8a5609168821 not found: ID does not exist"
Apr 17 16:55:59.451147 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:55:59.451126 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7bd46zssfj"]
Apr 17 16:56:00.523594 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:00.523503 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" path="/var/lib/kubelet/pods/0d00c91c-8b94-496b-b56f-a786d0ed7359/volumes"
Apr 17 16:56:07.138613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:07.138576 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"]
Apr 17 16:56:07.139115 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:07.138890 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="main" containerID="cri-o://10454f3f7805697c71899b30eda258ef04c1d03f71b1ad42387b2bb015ff4846" gracePeriod=30
Apr 17 16:56:07.139260 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:07.139209 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="tokenizer" containerID="cri-o://bded51f16a0c383d1d7cae9bc00e136541eae3bf36c646cd56600e39d861a03a" gracePeriod=30
Apr 17 16:56:07.453434 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:07.453352 2561 generic.go:358] "Generic (PLEG): container finished" podID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerID="10454f3f7805697c71899b30eda258ef04c1d03f71b1ad42387b2bb015ff4846" exitCode=0
Apr 17 16:56:07.453434 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:07.453420 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" event={"ID":"e20c62cd-e603-45d1-97c4-551cd3b5e352","Type":"ContainerDied","Data":"10454f3f7805697c71899b30eda258ef04c1d03f71b1ad42387b2bb015ff4846"}
Apr 17 16:56:08.465769 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.465736 2561 generic.go:358] "Generic (PLEG): container finished" podID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerID="bded51f16a0c383d1d7cae9bc00e136541eae3bf36c646cd56600e39d861a03a" exitCode=0
Apr 17 16:56:08.466246 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.465813 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" event={"ID":"e20c62cd-e603-45d1-97c4-551cd3b5e352","Type":"ContainerDied","Data":"bded51f16a0c383d1d7cae9bc00e136541eae3bf36c646cd56600e39d861a03a"}
Apr 17 16:56:08.506758 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.506734 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:56:08.618941 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.618911 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72tdc\" (UniqueName: \"kubernetes.io/projected/e20c62cd-e603-45d1-97c4-551cd3b5e352-kube-api-access-72tdc\") pod \"e20c62cd-e603-45d1-97c4-551cd3b5e352\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") "
Apr 17 16:56:08.619157 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.618953 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-kserve-provision-location\") pod \"e20c62cd-e603-45d1-97c4-551cd3b5e352\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") "
Apr 17 16:56:08.619157 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619002 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-uds\") pod \"e20c62cd-e603-45d1-97c4-551cd3b5e352\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") "
Apr 17 16:56:08.619157 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619126 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-tmp\") pod \"e20c62cd-e603-45d1-97c4-551cd3b5e352\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") "
Apr 17 16:56:08.619341 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619203 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-cache\") pod \"e20c62cd-e603-45d1-97c4-551cd3b5e352\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") "
Apr 17 16:56:08.619341 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619223 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e20c62cd-e603-45d1-97c4-551cd3b5e352" (UID: "e20c62cd-e603-45d1-97c4-551cd3b5e352"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:56:08.619341 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619326 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e20c62cd-e603-45d1-97c4-551cd3b5e352-tls-certs\") pod \"e20c62cd-e603-45d1-97c4-551cd3b5e352\" (UID: \"e20c62cd-e603-45d1-97c4-551cd3b5e352\") "
Apr 17 16:56:08.619494 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619423 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e20c62cd-e603-45d1-97c4-551cd3b5e352" (UID: "e20c62cd-e603-45d1-97c4-551cd3b5e352"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:56:08.619551 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619529 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e20c62cd-e603-45d1-97c4-551cd3b5e352" (UID: "e20c62cd-e603-45d1-97c4-551cd3b5e352"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:56:08.619607 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619585 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:56:08.619662 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619606 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:56:08.620016 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.619991 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e20c62cd-e603-45d1-97c4-551cd3b5e352" (UID: "e20c62cd-e603-45d1-97c4-551cd3b5e352"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:56:08.621580 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.621553 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e20c62cd-e603-45d1-97c4-551cd3b5e352-kube-api-access-72tdc" (OuterVolumeSpecName: "kube-api-access-72tdc") pod "e20c62cd-e603-45d1-97c4-551cd3b5e352" (UID: "e20c62cd-e603-45d1-97c4-551cd3b5e352"). InnerVolumeSpecName "kube-api-access-72tdc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:56:08.621904 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.621887 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e20c62cd-e603-45d1-97c4-551cd3b5e352-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e20c62cd-e603-45d1-97c4-551cd3b5e352" (UID: "e20c62cd-e603-45d1-97c4-551cd3b5e352"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 16:56:08.720572 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.720537 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e20c62cd-e603-45d1-97c4-551cd3b5e352-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:56:08.720572 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.720568 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72tdc\" (UniqueName: \"kubernetes.io/projected/e20c62cd-e603-45d1-97c4-551cd3b5e352-kube-api-access-72tdc\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:56:08.720572 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.720577 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:56:08.720787 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:08.720586 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e20c62cd-e603-45d1-97c4-551cd3b5e352-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:56:09.470957 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:09.470926 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk" event={"ID":"e20c62cd-e603-45d1-97c4-551cd3b5e352","Type":"ContainerDied","Data":"4340ee7cb9d8943593ee7d07df5b75cb509d656de483c06fc40e99fefac17fe0"}
Apr 17 16:56:09.470957 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:09.470949 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"
Apr 17 16:56:09.471463 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:09.470973 2561 scope.go:117] "RemoveContainer" containerID="bded51f16a0c383d1d7cae9bc00e136541eae3bf36c646cd56600e39d861a03a"
Apr 17 16:56:09.479091 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:09.478948 2561 scope.go:117] "RemoveContainer" containerID="10454f3f7805697c71899b30eda258ef04c1d03f71b1ad42387b2bb015ff4846"
Apr 17 16:56:09.486059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:09.486041 2561 scope.go:117] "RemoveContainer" containerID="e9002fa63cf9058512c59a8036208575073221407a08253780e355eee1d8998b"
Apr 17 16:56:09.492178 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:09.492151 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"]
Apr 17 16:56:09.495662 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:09.495638 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-757767dzqk"]
Apr 17 16:56:10.522851 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:10.522817 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" path="/var/lib/kubelet/pods/e20c62cd-e603-45d1-97c4-551cd3b5e352/volumes"
Apr 17 16:56:15.955300 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955264 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"]
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955550 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="storage-initializer"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955560 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="storage-initializer"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955569 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="main"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955575 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="main"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955585 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="storage-initializer"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955590 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="storage-initializer"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955598 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="tokenizer"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955603 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="tokenizer"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955615 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="tokenizer"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955620 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="tokenizer"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955627 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="main"
Apr 17 16:56:15.955658 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955632 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="main"
Apr 17 16:56:15.956018 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955681 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="main"
Apr 17 16:56:15.956018 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955694 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="tokenizer"
Apr 17 16:56:15.956018 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955703 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="e20c62cd-e603-45d1-97c4-551cd3b5e352" containerName="tokenizer"
Apr 17 16:56:15.956018 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.955711 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d00c91c-8b94-496b-b56f-a786d0ed7359" containerName="main"
Apr 17 16:56:15.960369 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.960347 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:15.962816 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.962789 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 17 16:56:15.963507 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.963488 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-ld29h\""
Apr 17 16:56:15.963654 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.963540 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 17 16:56:15.963654 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.963566 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 17 16:56:15.963654 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.963627 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-w2fpl\""
Apr 17 16:56:15.967639 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:15.967609 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"]
Apr 17 16:56:16.080410 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.080370 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.080410 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.080412 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.080613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.080436 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9716eb3f-b657-47f3-ad8c-fca357dad25d-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.080613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.080508 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqvmt\" (UniqueName: \"kubernetes.io/projected/9716eb3f-b657-47f3-ad8c-fca357dad25d-kube-api-access-tqvmt\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.080613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.080547 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.080613 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.080570 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181188 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181149 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181188 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181188 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9716eb3f-b657-47f3-ad8c-fca357dad25d-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181431 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181232 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqvmt\" (UniqueName: \"kubernetes.io/projected/9716eb3f-b657-47f3-ad8c-fca357dad25d-kube-api-access-tqvmt\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181431 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181265 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181431 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181294 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181431 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181339 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181634 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181586 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181688 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181641 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181739 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181716 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.181739 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.181714 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.183693 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.183672 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9716eb3f-b657-47f3-ad8c-fca357dad25d-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.189985 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.189959 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqvmt\" (UniqueName: \"kubernetes.io/projected/9716eb3f-b657-47f3-ad8c-fca357dad25d-kube-api-access-tqvmt\") pod \"router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.269838 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.269727 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"
Apr 17 16:56:16.399110 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.399001 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"]
Apr 17 16:56:16.401780 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:56:16.401748 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9716eb3f_b657_47f3_ad8c_fca357dad25d.slice/crio-bf31a3e2312d07ced7bd4e46da24ba4e3e84908594d26725d9a8bd54d4c10ae1 WatchSource:0}: Error finding container bf31a3e2312d07ced7bd4e46da24ba4e3e84908594d26725d9a8bd54d4c10ae1: Status 404 returned error can't find the container with id bf31a3e2312d07ced7bd4e46da24ba4e3e84908594d26725d9a8bd54d4c10ae1
Apr 17 16:56:16.496872 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.496838 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" event={"ID":"9716eb3f-b657-47f3-ad8c-fca357dad25d","Type":"ContainerStarted","Data":"743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c"}
Apr 17 16:56:16.496872 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:16.496874 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" event={"ID":"9716eb3f-b657-47f3-ad8c-fca357dad25d","Type":"ContainerStarted","Data":"bf31a3e2312d07ced7bd4e46da24ba4e3e84908594d26725d9a8bd54d4c10ae1"}
Apr 17 16:56:17.501600 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:17.501512 2561 generic.go:358] "Generic (PLEG): container finished" podID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerID="743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c" exitCode=0
Apr 17 16:56:17.501946 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:17.501599 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" event={"ID":"9716eb3f-b657-47f3-ad8c-fca357dad25d","Type":"ContainerDied","Data":"743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c"}
Apr 17 16:56:18.506445 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:18.506406 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" event={"ID":"9716eb3f-b657-47f3-ad8c-fca357dad25d","Type":"ContainerStarted","Data":"fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e"}
Apr 17 16:56:18.506445 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:18.506449 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" event={"ID":"9716eb3f-b657-47f3-ad8c-fca357dad25d","Type":"ContainerStarted","Data":"be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea"}
Apr 17 16:56:18.507009 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:18.506536 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready"
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" Apr 17 16:56:18.528044 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:18.527984 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" podStartSLOduration=3.527968955 podStartE2EDuration="3.527968955s" podCreationTimestamp="2026-04-17 16:56:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:56:18.526478127 +0000 UTC m=+1526.424239583" watchObservedRunningTime="2026-04-17 16:56:18.527968955 +0000 UTC m=+1526.425730410" Apr 17 16:56:26.270249 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:26.270197 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" Apr 17 16:56:26.270768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:26.270261 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" Apr 17 16:56:26.272950 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:26.272918 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" Apr 17 16:56:26.535034 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:26.534956 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" Apr 17 16:56:47.539498 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:56:47.539468 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" Apr 17 16:58:07.224360 ip-10-0-130-35 
kubenswrapper[2561]: I0417 16:58:07.224274 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"] Apr 17 16:58:07.224920 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:07.224582 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="main" containerID="cri-o://be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea" gracePeriod=30 Apr 17 16:58:07.224920 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:07.224623 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="tokenizer" containerID="cri-o://fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e" gracePeriod=30 Apr 17 16:58:07.537874 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:58:07.537788 2561 logging.go:55] [core] [Channel #506 SubChannel #507]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.38:9003", ServerName: "10.132.0.38:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.38:9003: connect: connection refused" Apr 17 16:58:07.864261 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:07.864226 2561 generic.go:358] "Generic (PLEG): container finished" podID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerID="be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea" exitCode=0 Apr 17 16:58:07.864439 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:07.864284 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" event={"ID":"9716eb3f-b657-47f3-ad8c-fca357dad25d","Type":"ContainerDied","Data":"be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea"} Apr 17 16:58:08.475609 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.475582 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" Apr 17 16:58:08.538211 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.538133 2561 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.38:9003\" within 1s: context deadline exceeded" Apr 17 16:58:08.574351 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574318 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rf8rk/must-gather-j66ht"] Apr 17 16:58:08.574628 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574615 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="storage-initializer" Apr 17 16:58:08.574678 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574632 2561 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="storage-initializer" Apr 17 16:58:08.574678 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574655 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="tokenizer" Apr 17 16:58:08.574678 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574661 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="tokenizer" Apr 17 16:58:08.574678 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574670 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="main" Apr 17 16:58:08.574678 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574675 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="main" Apr 17 16:58:08.574828 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574724 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="tokenizer" Apr 17 16:58:08.574828 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.574736 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerName="main" Apr 17 16:58:08.577769 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.577746 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf8rk/must-gather-j66ht" Apr 17 16:58:08.579279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579253 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-cache\") pod \"9716eb3f-b657-47f3-ad8c-fca357dad25d\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " Apr 17 16:58:08.579381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579299 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-uds\") pod \"9716eb3f-b657-47f3-ad8c-fca357dad25d\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " Apr 17 16:58:08.579381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579314 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-tmp\") pod \"9716eb3f-b657-47f3-ad8c-fca357dad25d\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " Apr 17 16:58:08.579381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579342 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqvmt\" (UniqueName: \"kubernetes.io/projected/9716eb3f-b657-47f3-ad8c-fca357dad25d-kube-api-access-tqvmt\") pod \"9716eb3f-b657-47f3-ad8c-fca357dad25d\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " Apr 17 16:58:08.579381 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579366 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-kserve-provision-location\") pod \"9716eb3f-b657-47f3-ad8c-fca357dad25d\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " Apr 17 16:58:08.579571 
ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579384 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9716eb3f-b657-47f3-ad8c-fca357dad25d-tls-certs\") pod \"9716eb3f-b657-47f3-ad8c-fca357dad25d\" (UID: \"9716eb3f-b657-47f3-ad8c-fca357dad25d\") " Apr 17 16:58:08.579571 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579551 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9716eb3f-b657-47f3-ad8c-fca357dad25d" (UID: "9716eb3f-b657-47f3-ad8c-fca357dad25d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:08.579571 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579562 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9716eb3f-b657-47f3-ad8c-fca357dad25d" (UID: "9716eb3f-b657-47f3-ad8c-fca357dad25d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:08.579768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579676 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9716eb3f-b657-47f3-ad8c-fca357dad25d" (UID: "9716eb3f-b657-47f3-ad8c-fca357dad25d"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:08.579768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579689 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-uds\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:58:08.579768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579710 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-cache\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:58:08.580242 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.579978 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rf8rk\"/\"openshift-service-ca.crt\"" Apr 17 16:58:08.580242 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.580017 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rf8rk\"/\"default-dockercfg-vn5xt\"" Apr 17 16:58:08.580242 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.580061 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rf8rk\"/\"kube-root-ca.crt\"" Apr 17 16:58:08.580418 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.580235 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9716eb3f-b657-47f3-ad8c-fca357dad25d" (UID: "9716eb3f-b657-47f3-ad8c-fca357dad25d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:58:08.581716 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.581695 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9716eb3f-b657-47f3-ad8c-fca357dad25d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9716eb3f-b657-47f3-ad8c-fca357dad25d" (UID: "9716eb3f-b657-47f3-ad8c-fca357dad25d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:58:08.581799 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.581776 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9716eb3f-b657-47f3-ad8c-fca357dad25d-kube-api-access-tqvmt" (OuterVolumeSpecName: "kube-api-access-tqvmt") pod "9716eb3f-b657-47f3-ad8c-fca357dad25d" (UID: "9716eb3f-b657-47f3-ad8c-fca357dad25d"). InnerVolumeSpecName "kube-api-access-tqvmt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:58:08.586059 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.586035 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rf8rk/must-gather-j66ht"] Apr 17 16:58:08.680735 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.680703 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg2t7\" (UniqueName: \"kubernetes.io/projected/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-kube-api-access-lg2t7\") pod \"must-gather-j66ht\" (UID: \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\") " pod="openshift-must-gather-rf8rk/must-gather-j66ht" Apr 17 16:58:08.680735 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.680736 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-must-gather-output\") pod \"must-gather-j66ht\" (UID: \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\") " 
pod="openshift-must-gather-rf8rk/must-gather-j66ht" Apr 17 16:58:08.680963 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.680773 2561 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-tokenizer-tmp\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:58:08.680963 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.680789 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tqvmt\" (UniqueName: \"kubernetes.io/projected/9716eb3f-b657-47f3-ad8c-fca357dad25d-kube-api-access-tqvmt\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:58:08.680963 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.680803 2561 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9716eb3f-b657-47f3-ad8c-fca357dad25d-kserve-provision-location\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:58:08.680963 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.680814 2561 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9716eb3f-b657-47f3-ad8c-fca357dad25d-tls-certs\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\"" Apr 17 16:58:08.782067 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.782017 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lg2t7\" (UniqueName: \"kubernetes.io/projected/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-kube-api-access-lg2t7\") pod \"must-gather-j66ht\" (UID: \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\") " pod="openshift-must-gather-rf8rk/must-gather-j66ht" Apr 17 16:58:08.782067 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.782066 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-must-gather-output\") pod \"must-gather-j66ht\" (UID: \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\") " pod="openshift-must-gather-rf8rk/must-gather-j66ht" Apr 17 16:58:08.782385 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.782369 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-must-gather-output\") pod \"must-gather-j66ht\" (UID: \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\") " pod="openshift-must-gather-rf8rk/must-gather-j66ht" Apr 17 16:58:08.792501 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.792454 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg2t7\" (UniqueName: \"kubernetes.io/projected/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-kube-api-access-lg2t7\") pod \"must-gather-j66ht\" (UID: \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\") " pod="openshift-must-gather-rf8rk/must-gather-j66ht" Apr 17 16:58:08.868745 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.868715 2561 generic.go:358] "Generic (PLEG): container finished" podID="9716eb3f-b657-47f3-ad8c-fca357dad25d" containerID="fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e" exitCode=0 Apr 17 16:58:08.868927 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.868792 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" event={"ID":"9716eb3f-b657-47f3-ad8c-fca357dad25d","Type":"ContainerDied","Data":"fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e"} Apr 17 16:58:08.868927 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.868806 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" Apr 17 16:58:08.868927 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.868825 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv" event={"ID":"9716eb3f-b657-47f3-ad8c-fca357dad25d","Type":"ContainerDied","Data":"bf31a3e2312d07ced7bd4e46da24ba4e3e84908594d26725d9a8bd54d4c10ae1"} Apr 17 16:58:08.868927 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.868840 2561 scope.go:117] "RemoveContainer" containerID="fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e" Apr 17 16:58:08.876770 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.876744 2561 scope.go:117] "RemoveContainer" containerID="be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea" Apr 17 16:58:08.883902 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.883883 2561 scope.go:117] "RemoveContainer" containerID="743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c" Apr 17 16:58:08.890986 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.890963 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"] Apr 17 16:58:08.891089 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.890967 2561 scope.go:117] "RemoveContainer" containerID="fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e" Apr 17 16:58:08.891298 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:58:08.891275 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e\": container with ID starting with fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e not found: ID does not exist" containerID="fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e" Apr 17 
16:58:08.891363 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.891310 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e"} err="failed to get container status \"fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e\": rpc error: code = NotFound desc = could not find container \"fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e\": container with ID starting with fd169ec5473295138bd7d455f658bc50cb2be7bdbac918d47b80a84c7e24641e not found: ID does not exist" Apr 17 16:58:08.891363 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.891327 2561 scope.go:117] "RemoveContainer" containerID="be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea" Apr 17 16:58:08.891562 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:58:08.891544 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea\": container with ID starting with be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea not found: ID does not exist" containerID="be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea" Apr 17 16:58:08.891604 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.891568 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea"} err="failed to get container status \"be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea\": rpc error: code = NotFound desc = could not find container \"be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea\": container with ID starting with be96a7e2755bdc19480c1054758beb499ecb55afdf1a411e83f3e59f24cdd5ea not found: ID does not exist" Apr 17 16:58:08.891604 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.891584 2561 scope.go:117] 
"RemoveContainer" containerID="743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c" Apr 17 16:58:08.891821 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:58:08.891803 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c\": container with ID starting with 743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c not found: ID does not exist" containerID="743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c" Apr 17 16:58:08.891864 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.891826 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c"} err="failed to get container status \"743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c\": rpc error: code = NotFound desc = could not find container \"743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c\": container with ID starting with 743b8bda0a779c2122ed895a9d1b24af1bd29fe5e030a7627edb09fd9fc5991c not found: ID does not exist" Apr 17 16:58:08.895222 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.895198 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-c7fffbf4dw78xv"] Apr 17 16:58:08.898865 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:08.898850 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf8rk/must-gather-j66ht" Apr 17 16:58:09.015515 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:09.015370 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rf8rk/must-gather-j66ht"] Apr 17 16:58:09.017517 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:58:09.017485 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cd5aacd_100d_40f8_a5bc_140a4e2a08b6.slice/crio-3b745b5988e631c5158a864b3feffff9fc66d16825947083556a7bd83b48bf45 WatchSource:0}: Error finding container 3b745b5988e631c5158a864b3feffff9fc66d16825947083556a7bd83b48bf45: Status 404 returned error can't find the container with id 3b745b5988e631c5158a864b3feffff9fc66d16825947083556a7bd83b48bf45 Apr 17 16:58:09.875304 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:09.875265 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf8rk/must-gather-j66ht" event={"ID":"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6","Type":"ContainerStarted","Data":"3b745b5988e631c5158a864b3feffff9fc66d16825947083556a7bd83b48bf45"} Apr 17 16:58:10.524990 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:10.524955 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9716eb3f-b657-47f3-ad8c-fca357dad25d" path="/var/lib/kubelet/pods/9716eb3f-b657-47f3-ad8c-fca357dad25d/volumes" Apr 17 16:58:13.890356 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:13.890313 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf8rk/must-gather-j66ht" event={"ID":"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6","Type":"ContainerStarted","Data":"a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be"} Apr 17 16:58:13.890356 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:13.890358 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf8rk/must-gather-j66ht" 
event={"ID":"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6","Type":"ContainerStarted","Data":"fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f"}
Apr 17 16:58:13.913506 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:13.913450 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rf8rk/must-gather-j66ht" podStartSLOduration=2.08689781 podStartE2EDuration="5.913433398s" podCreationTimestamp="2026-04-17 16:58:08 +0000 UTC" firstStartedPulling="2026-04-17 16:58:09.019065409 +0000 UTC m=+1636.916826842" lastFinishedPulling="2026-04-17 16:58:12.845600994 +0000 UTC m=+1640.743362430" observedRunningTime="2026-04-17 16:58:13.912754958 +0000 UTC m=+1641.810516414" watchObservedRunningTime="2026-04-17 16:58:13.913433398 +0000 UTC m=+1641.811194851"
Apr 17 16:58:37.086567 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:37.086535 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-k5849_9df89ec2-ebe9-4433-b760-9c3d2994fe9c/discovery/0.log"
Apr 17 16:58:37.922635 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:37.922598 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-k5849_9df89ec2-ebe9-4433-b760-9c3d2994fe9c/discovery/0.log"
Apr 17 16:58:38.751958 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:38.751930 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-7xpg5_f0abb617-8d1d-497b-a1e3-af010c9b5798/authorino/0.log"
Apr 17 16:58:38.767485 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:38.767459 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-q7jmp_d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06/manager/0.log"
Apr 17 16:58:38.782568 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:38.782540 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-gd8tq_36b7257f-c549-4ab8-bae6-f19e9c6af6ca/manager/0.log"
Apr 17 16:58:38.861989 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:38.861956 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-xb4wn_753159f9-d923-44a6-83cf-0139114f5f7d/manager/0.log"
Apr 17 16:58:39.981275 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:39.981238 2561 generic.go:358] "Generic (PLEG): container finished" podID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerID="fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f" exitCode=0
Apr 17 16:58:39.981675 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:39.981290 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf8rk/must-gather-j66ht" event={"ID":"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6","Type":"ContainerDied","Data":"fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f"}
Apr 17 16:58:39.981675 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:39.981571 2561 scope.go:117] "RemoveContainer" containerID="fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f"
Apr 17 16:58:40.528432 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:40.528403 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rf8rk_must-gather-j66ht_6cd5aacd-100d-40f8-a5bc-140a4e2a08b6/gather/0.log"
Apr 17 16:58:43.891139 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:43.891106 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gpb6x_b94229cb-1bfa-4a97-b64a-bde473f3d9e0/global-pull-secret-syncer/0.log"
Apr 17 16:58:44.023778 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:44.023744 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hvbjq_46096eec-9a89-49c4-a719-dad5e2d71c2f/konnectivity-agent/0.log"
Apr 17 16:58:44.077153 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:44.077119 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-35.ec2.internal_d11f5615e8652fc4b9c2e3bdd1d65617/haproxy/0.log"
Apr 17 16:58:46.003008 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.002974 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rf8rk/must-gather-j66ht"]
Apr 17 16:58:46.003411 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.003208 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-rf8rk/must-gather-j66ht" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerName="copy" containerID="cri-o://a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be" gracePeriod=2
Apr 17 16:58:46.007394 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.006913 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rf8rk/must-gather-j66ht"]
Apr 17 16:58:46.234198 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.234172 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rf8rk_must-gather-j66ht_6cd5aacd-100d-40f8-a5bc-140a4e2a08b6/copy/0.log"
Apr 17 16:58:46.234560 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.234546 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rf8rk/must-gather-j66ht"
Apr 17 16:58:46.236569 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.236547 2561 status_manager.go:895] "Failed to get status for pod" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" pod="openshift-must-gather-rf8rk/must-gather-j66ht" err="pods \"must-gather-j66ht\" is forbidden: User \"system:node:ip-10-0-130-35.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-rf8rk\": no relationship found between node 'ip-10-0-130-35.ec2.internal' and this object"
Apr 17 16:58:46.405648 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.405618 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-must-gather-output\") pod \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\" (UID: \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\") "
Apr 17 16:58:46.405648 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.405656 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg2t7\" (UniqueName: \"kubernetes.io/projected/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-kube-api-access-lg2t7\") pod \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\" (UID: \"6cd5aacd-100d-40f8-a5bc-140a4e2a08b6\") "
Apr 17 16:58:46.407890 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.407852 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-kube-api-access-lg2t7" (OuterVolumeSpecName: "kube-api-access-lg2t7") pod "6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" (UID: "6cd5aacd-100d-40f8-a5bc-140a4e2a08b6"). InnerVolumeSpecName "kube-api-access-lg2t7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 16:58:46.411803 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.411770 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" (UID: "6cd5aacd-100d-40f8-a5bc-140a4e2a08b6"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 16:58:46.506513 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.506473 2561 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-must-gather-output\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:58:46.506513 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.506513 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lg2t7\" (UniqueName: \"kubernetes.io/projected/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6-kube-api-access-lg2t7\") on node \"ip-10-0-130-35.ec2.internal\" DevicePath \"\""
Apr 17 16:58:46.523630 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:46.523594 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" path="/var/lib/kubelet/pods/6cd5aacd-100d-40f8-a5bc-140a4e2a08b6/volumes"
Apr 17 16:58:47.006162 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.006134 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rf8rk_must-gather-j66ht_6cd5aacd-100d-40f8-a5bc-140a4e2a08b6/copy/0.log"
Apr 17 16:58:47.006576 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.006447 2561 generic.go:358] "Generic (PLEG): container finished" podID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerID="a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be" exitCode=143
Apr 17 16:58:47.006576 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.006514 2561 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rf8rk/must-gather-j66ht"
Apr 17 16:58:47.006576 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.006542 2561 scope.go:117] "RemoveContainer" containerID="a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be"
Apr 17 16:58:47.013966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.013946 2561 scope.go:117] "RemoveContainer" containerID="fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f"
Apr 17 16:58:47.028131 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.028110 2561 scope.go:117] "RemoveContainer" containerID="a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be"
Apr 17 16:58:47.028456 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:58:47.028435 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be\": container with ID starting with a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be not found: ID does not exist" containerID="a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be"
Apr 17 16:58:47.028510 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.028466 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be"} err="failed to get container status \"a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be\": rpc error: code = NotFound desc = could not find container \"a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be\": container with ID starting with a8b9574a529ea70a8a478d62df72c9ac1b8fd80288a5a96a2ae20182387da7be not found: ID does not exist"
Apr 17 16:58:47.028510 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.028484 2561 scope.go:117] "RemoveContainer" containerID="fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f"
Apr 17 16:58:47.028766 ip-10-0-130-35 kubenswrapper[2561]: E0417 16:58:47.028741 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f\": container with ID starting with fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f not found: ID does not exist" containerID="fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f"
Apr 17 16:58:47.028811 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:47.028775 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f"} err="failed to get container status \"fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f\": rpc error: code = NotFound desc = could not find container \"fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f\": container with ID starting with fe5b7074c6c4a7475b78fc1058bb94b28c5532340f9050eded282f026d3fd09f not found: ID does not exist"
Apr 17 16:58:48.036456 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:48.036420 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-7xpg5_f0abb617-8d1d-497b-a1e3-af010c9b5798/authorino/0.log"
Apr 17 16:58:48.086844 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:48.086810 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-q7jmp_d694cd4e-5c4f-47bd-9ef8-702ad3dc4a06/manager/0.log"
Apr 17 16:58:48.117352 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:48.117312 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-gd8tq_36b7257f-c549-4ab8-bae6-f19e9c6af6ca/manager/0.log"
Apr 17 16:58:48.262435 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:48.262407 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-xb4wn_753159f9-d923-44a6-83cf-0139114f5f7d/manager/0.log"
Apr 17 16:58:49.674153 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:49.674125 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6n8jh_5686acea-41ea-4c2e-a01a-b0faaf0e86e6/node-exporter/0.log"
Apr 17 16:58:49.698329 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:49.698307 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6n8jh_5686acea-41ea-4c2e-a01a-b0faaf0e86e6/kube-rbac-proxy/0.log"
Apr 17 16:58:49.723088 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:49.723059 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6n8jh_5686acea-41ea-4c2e-a01a-b0faaf0e86e6/init-textfile/0.log"
Apr 17 16:58:52.668669 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.668637 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"]
Apr 17 16:58:52.669048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.668964 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerName="gather"
Apr 17 16:58:52.669048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.668977 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerName="gather"
Apr 17 16:58:52.669048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.668986 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerName="copy"
Apr 17 16:58:52.669048 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.668993 2561 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerName="copy"
Apr 17 16:58:52.669195 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.669054 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerName="gather"
Apr 17 16:58:52.669195 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.669064 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cd5aacd-100d-40f8-a5bc-140a4e2a08b6" containerName="copy"
Apr 17 16:58:52.675270 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.675245 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.677768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.677744 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2whr7\"/\"default-dockercfg-8vtfj\""
Apr 17 16:58:52.678562 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.678543 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2whr7\"/\"kube-root-ca.crt\""
Apr 17 16:58:52.678682 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.678640 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2whr7\"/\"openshift-service-ca.crt\""
Apr 17 16:58:52.683914 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.683893 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"]
Apr 17 16:58:52.747966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.747929 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8cb\" (UniqueName: \"kubernetes.io/projected/7be9132e-1bc5-494c-be7b-98363ada4c66-kube-api-access-fx8cb\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.747966 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.747966 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-podres\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.748279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.747986 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-proc\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.748279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.748130 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-lib-modules\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.748279 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.748180 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-sys\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848563 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848519 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8cb\" (UniqueName: \"kubernetes.io/projected/7be9132e-1bc5-494c-be7b-98363ada4c66-kube-api-access-fx8cb\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848563 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848559 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-podres\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848820 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848584 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-proc\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848820 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848617 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-lib-modules\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848820 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848652 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-sys\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848820 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848709 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-proc\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848820 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848728 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-sys\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848820 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848733 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-podres\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.848820 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.848770 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7be9132e-1bc5-494c-be7b-98363ada4c66-lib-modules\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.857068 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.857048 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8cb\" (UniqueName: \"kubernetes.io/projected/7be9132e-1bc5-494c-be7b-98363ada4c66-kube-api-access-fx8cb\") pod \"perf-node-gather-daemonset-cdcc2\" (UID: \"7be9132e-1bc5-494c-be7b-98363ada4c66\") " pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:52.985929 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:52.985828 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:53.105992 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:53.105951 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"]
Apr 17 16:58:53.108620 ip-10-0-130-35 kubenswrapper[2561]: W0417 16:58:53.108584 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7be9132e_1bc5_494c_be7b_98363ada4c66.slice/crio-bc2bbfd6f93f8a4a08f64c699d631e80fc9892146ca9b88923d3e72e64a3b55c WatchSource:0}: Error finding container bc2bbfd6f93f8a4a08f64c699d631e80fc9892146ca9b88923d3e72e64a3b55c: Status 404 returned error can't find the container with id bc2bbfd6f93f8a4a08f64c699d631e80fc9892146ca9b88923d3e72e64a3b55c
Apr 17 16:58:53.110170 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:53.110153 2561 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:58:54.032587 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:54.032547 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2" event={"ID":"7be9132e-1bc5-494c-be7b-98363ada4c66","Type":"ContainerStarted","Data":"03104b6484777e8b1fac61643f65f81e863890bdccd963ef1b66264d7499a4a8"}
Apr 17 16:58:54.032587 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:54.032589 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2" event={"ID":"7be9132e-1bc5-494c-be7b-98363ada4c66","Type":"ContainerStarted","Data":"bc2bbfd6f93f8a4a08f64c699d631e80fc9892146ca9b88923d3e72e64a3b55c"}
Apr 17 16:58:54.033009 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:54.032658 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:58:54.050846 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:54.050805 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2" podStartSLOduration=2.050790937 podStartE2EDuration="2.050790937s" podCreationTimestamp="2026-04-17 16:58:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:58:54.049874982 +0000 UTC m=+1681.947636437" watchObservedRunningTime="2026-04-17 16:58:54.050790937 +0000 UTC m=+1681.948552393"
Apr 17 16:58:54.153969 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:54.153940 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hhlb4_be864e9c-2445-4ad3-8453-a28d5bd5fda2/dns/0.log"
Apr 17 16:58:54.211768 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:54.211743 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hhlb4_be864e9c-2445-4ad3-8453-a28d5bd5fda2/kube-rbac-proxy/0.log"
Apr 17 16:58:54.341126 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:54.341094 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lpfgv_65e080a5-8430-43f5-b120-a3fff8102219/dns-node-resolver/0.log"
Apr 17 16:58:54.857854 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:54.857825 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-chsm2_05559253-f52c-49e6-a8e0-1751350669ac/node-ca/0.log"
Apr 17 16:58:55.724263 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:55.724229 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-k5849_9df89ec2-ebe9-4433-b760-9c3d2994fe9c/discovery/0.log"
Apr 17 16:58:56.280533 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:56.280502 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hcv89_947405ad-c2f6-4581-b056-296308a2cc2f/serve-healthcheck-canary/0.log"
Apr 17 16:58:56.848916 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:56.848886 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fckrp_74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37/kube-rbac-proxy/0.log"
Apr 17 16:58:56.876854 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:56.876821 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fckrp_74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37/exporter/0.log"
Apr 17 16:58:56.902288 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:56.902257 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fckrp_74a3f7f5-a863-48f7-bd64-3d1d9ca3fd37/extractor/0.log"
Apr 17 16:58:59.643161 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:58:59.643130 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-nnj8z_446e45a1-b00b-469f-9068-28457cf96abf/openshift-lws-operator/0.log"
Apr 17 16:59:00.045342 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:00.045268 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2whr7/perf-node-gather-daemonset-cdcc2"
Apr 17 16:59:00.256644 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:00.256566 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-6df48d4859-mht88_7431d8d2-7cd4-4bc5-920c-37eceb31d473/manager/0.log"
Apr 17 16:59:00.278498 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:00.278470 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-pcbrh_6b89a88d-91b8-4e59-adb8-3d132ceef8a1/server/0.log"
Apr 17 16:59:06.769176 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:06.769149 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-642q4_8fbceac7-0307-49d3-8986-e1b49a4b6760/kube-multus-additional-cni-plugins/0.log"
Apr 17 16:59:06.793323 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:06.793293 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-642q4_8fbceac7-0307-49d3-8986-e1b49a4b6760/egress-router-binary-copy/0.log"
Apr 17 16:59:06.816444 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:06.816418 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-642q4_8fbceac7-0307-49d3-8986-e1b49a4b6760/cni-plugins/0.log"
Apr 17 16:59:06.843319 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:06.843291 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-642q4_8fbceac7-0307-49d3-8986-e1b49a4b6760/bond-cni-plugin/0.log"
Apr 17 16:59:06.865228 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:06.865196 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-642q4_8fbceac7-0307-49d3-8986-e1b49a4b6760/routeoverride-cni/0.log"
Apr 17 16:59:06.887206 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:06.887173 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-642q4_8fbceac7-0307-49d3-8986-e1b49a4b6760/whereabouts-cni-bincopy/0.log"
Apr 17 16:59:06.912909 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:06.912878 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-642q4_8fbceac7-0307-49d3-8986-e1b49a4b6760/whereabouts-cni/0.log"
Apr 17 16:59:07.340260 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:07.340228 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgvnr_c3d773c7-2b0f-4dff-a3ec-72f61e88111c/kube-multus/0.log"
Apr 17 16:59:07.474730 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:07.474701 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wdt9k_8a91f76e-d64e-4d72-92ff-c27c12f465d2/network-metrics-daemon/0.log"
Apr 17 16:59:07.496671 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:07.496647 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wdt9k_8a91f76e-d64e-4d72-92ff-c27c12f465d2/kube-rbac-proxy/0.log"
Apr 17 16:59:08.283588 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:08.283556 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qkt5m_f0164bdb-ef92-4743-91fc-03f010abe474/ovn-controller/0.log"
Apr 17 16:59:08.310864 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:08.310833 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qkt5m_f0164bdb-ef92-4743-91fc-03f010abe474/ovn-acl-logging/0.log"
Apr 17 16:59:08.335912 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:08.335885 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qkt5m_f0164bdb-ef92-4743-91fc-03f010abe474/kube-rbac-proxy-node/0.log"
Apr 17 16:59:08.364253 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:08.364223 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qkt5m_f0164bdb-ef92-4743-91fc-03f010abe474/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 16:59:08.392520 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:08.392492 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qkt5m_f0164bdb-ef92-4743-91fc-03f010abe474/northd/0.log"
Apr 17 16:59:08.415313 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:08.415292 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qkt5m_f0164bdb-ef92-4743-91fc-03f010abe474/nbdb/0.log"
Apr 17 16:59:08.440090 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:08.440047 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qkt5m_f0164bdb-ef92-4743-91fc-03f010abe474/sbdb/0.log"
Apr 17 16:59:08.541137 ip-10-0-130-35 kubenswrapper[2561]: I0417 16:59:08.541054 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qkt5m_f0164bdb-ef92-4743-91fc-03f010abe474/ovnkube-controller/0.log"